1. Sühn T, Esmaeili N, Spiller M, Costa M, Boese A, Bertrand J, Pandey A, Lohmann C, Friebe M, Illanes A. Vibro-acoustic sensing of tissue-instrument-interactions allows a differentiation of biological tissue in computerised palpation. Comput Biol Med 2023; 164:107272. PMID: 37515873. DOI: 10.1016/j.compbiomed.2023.107272. Received 28 Jan 2023; revised 26 Jun 2023; accepted 16 Jul 2023.
Abstract
BACKGROUND: The shift towards minimally invasive surgery significantly reduces the tactile information available to the surgeon; compensation strategies range from vision-based techniques to the integration of sensing concepts into surgical instruments. Tactile information is vital for palpation tasks such as the differentiation of tissues or the characterisation of surfaces. This work investigates a new sensing approach that derives palpation-related information from the vibration signals generated by instrument-tissue interactions.

METHODS: We conducted a feasibility study to differentiate three non-animal and three animal tissue specimens based on palpation of their surfaces. A sensor configuration was mounted at the proximal end of a standard instrument, opposite the tissue-interaction point. Vibro-acoustic signals of 1680 palpation events were acquired, and the time-varying spectrum was computed using the continuous wavelet transform. For validation, nine spectral energy-related features were calculated for subsequent classification with a linear support vector machine (SVM) and k-nearest neighbours (kNN).

RESULTS: Indicators derived from the vibration signal are highly stable across palpations of the same tissue specimen, regardless of the palpating subject. Differences in the surface texture of the tissue specimens are reflected in these indicators and can serve as a basis for differentiation. Classification following a supervised learning approach achieves an accuracy of >93.8% for the three-tissue tasks, decreasing to 78.8% when all six tissues are combined.

CONCLUSIONS: Simple features derived from the vibro-acoustic signals facilitate the differentiation of biological tissues, showing the potential of the presented approach to provide information about the interacting tissue. The results encourage further investigation of a still little-exploited source of information in minimally invasive surgery.
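The processing chain described in this abstract (wavelet-based time-frequency analysis, spectral energy features, supervised classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, Morlet wavelet, scale set, synthetic signals, and the nearest-centroid decision (a stand-in for the paper's linear SVM/kNN) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 1000  # sampling rate in Hz (assumption)

def morlet_cwt(x, scales, w0=6.0):
    """Naive continuous wavelet transform: convolve with scaled Morlet wavelets."""
    n = len(x)
    t = np.arange(-n // 2, n // 2)
    rows = []
    for s in scales:
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        rows.append(np.convolve(x, wavelet, mode="same"))
    return np.array(rows)

def spectral_energy_features(x, scales):
    """Relative spectral energy per scale (stand-in for the paper's nine features)."""
    power = np.abs(morlet_cwt(x, scales)) ** 2
    e = power.sum(axis=1)
    return e / e.sum()

def make_palpation_signal(texture_freq, n=512):
    """Synthetic vibro-acoustic event: one dominant texture frequency plus noise."""
    t = np.arange(n) / FS
    return np.sin(2 * np.pi * texture_freq * t) + 0.2 * rng.standard_normal(n)

scales = np.array([4.0, 8.0, 16.0, 32.0, 64.0])  # dyadic scales (assumption)

# Two synthetic "tissues" with different surface-texture frequencies.
train_a = [spectral_energy_features(make_palpation_signal(20.0), scales) for _ in range(10)]
train_b = [spectral_energy_features(make_palpation_signal(80.0), scales) for _ in range(10)]
centroid_a = np.mean(train_a, axis=0)
centroid_b = np.mean(train_b, axis=0)

def classify(x):
    """Nearest-centroid decision in feature space."""
    f = spectral_energy_features(x, scales)
    return "A" if np.linalg.norm(f - centroid_a) < np.linalg.norm(f - centroid_b) else "B"
```

Because the features are normalised energy distributions over scales, signals with different dominant texture frequencies map to clearly separated points in feature space, which is what makes even this simple classifier work.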
Affiliation(s)
- Thomas Sühn
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany; SURAG Medical GmbH, Leipzig, Germany.
- Maximilian Costa
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany.
- Axel Boese
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University, Magdeburg, Germany.
- Jessica Bertrand
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany.
- Ajay Pandey
- Queensland University of Technology, School of Electrical Engineering & Robotics, Brisbane, Australia.
- Christoph Lohmann
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany.
- Michael Friebe
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University, Magdeburg, Germany; AGH University of Science and Technology, Department of Measurement and Electronics, Kraków, Poland; CIB - Center of Innovation and Business Development, FOM University of Applied Sciences, Essen, Germany.
2. Sühn T, Esmaeili N, Mattepu SY, Spiller M, Boese A, Urrutia R, Poblete V, Hansen C, Lohmann CH, Illanes A, Friebe M. Vibro-Acoustic Sensing of Instrument Interactions as a Potential Source of Texture-Related Information in Robotic Palpation. Sensors (Basel) 2023; 23:3141. PMID: 36991854. PMCID: PMC10056323. DOI: 10.3390/s23063141. Received 25 Jan 2023; revised 2 Mar 2023; accepted 9 Mar 2023.
Abstract
The direct tactile assessment of surface textures during palpation is an essential component of open surgery that is impeded in minimally invasive and robot-assisted surgery. When palpating indirectly with a surgical instrument, the structural vibrations from the interaction contain tactile information that can be extracted and analysed. This study investigates the influence of the contact angle α and the palpation velocity v on the vibro-acoustic signals of such indirect palpation. A 7-DOF robotic arm, a standard surgical instrument, and a vibration measurement system were used to palpate three different materials with varying α and v. The signals were processed using the continuous wavelet transform and showed material-specific signatures in the time-frequency domain that retained their general characteristics as α and v varied. Energy-related and statistical features were extracted, and supervised classification was performed in which the testing data comprised only signals acquired with palpation parameters different from those of the training data. The support vector machine and k-nearest-neighbours classifiers achieved 99.67% and 96.00% accuracy, respectively, in differentiating the materials. The results indicate that the features are robust against variations in the palpation parameters, a prerequisite for application in minimally invasive surgery that still needs to be confirmed in realistic experiments with biological tissues.
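The evaluation protocol highlighted in this abstract, testing only on palpation parameters unseen during training, can be sketched as a leave-one-condition-out split. Everything below is a hypothetical stand-in: the feature model, the condition-dependent shift, and the 1-nearest-neighbour classifier are illustrative assumptions, not the authors' data or method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dataset: each sample is a 4-D feature vector labelled with its
# material and the palpation condition (contact angle, velocity) it came from.
def make_sample(material, angle, velocity):
    base = {"A": 0.0, "B": 3.0, "C": 6.0}[material]
    # The condition shifts the features slightly; robust features must keep
    # the classes separable despite this shift.
    shift = 0.02 * angle + 0.2 * velocity
    return base + shift + 0.3 * rng.standard_normal(4)

conditions = [(30, 1), (30, 2), (45, 1), (45, 2)]
data = [(make_sample(m, a, v), m, (a, v))
        for m in "ABC" for (a, v) in conditions for _ in range(5)]

def leave_condition_out(data, held_out):
    """Train on all conditions except `held_out`; test on `held_out` only."""
    train = [(x, y) for x, y, c in data if c != held_out]
    test = [(x, y) for x, y, c in data if c == held_out]
    # 1-nearest-neighbour stand-in for the SVM/kNN classifiers in the paper.
    correct = sum(
        min(train, key=lambda t: np.linalg.norm(t[0] - x))[1] == y
        for x, y in test
    )
    return correct / len(test)

accuracies = [leave_condition_out(data, c) for c in conditions]
```

High held-out accuracy under this split indicates that the features encode the material rather than the palpation parameters, which is the robustness claim the paper makes.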
Affiliation(s)
- Thomas Sühn
- Department of Orthopaedic Surgery, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- SURAG Medical GmbH, 39118 Magdeburg, Germany
- Sandeep Y. Mattepu
- INKA Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Axel Boese
- INKA Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Robin Urrutia
- Instituto de Acústica, Facultad de Ciencias de la Ingeniería, Universidad Austral de Chile, Valdivia 5111187, Chile
- Victor Poblete
- Instituto de Acústica, Facultad de Ciencias de la Ingeniería, Universidad Austral de Chile, Valdivia 5111187, Chile
- Christian Hansen
- Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany
- Christoph H. Lohmann
- Department of Orthopaedic Surgery, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Michael Friebe
- INKA Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Department of Measurement and Electronics, AGH University of Science and Technology, 30-059 Kraków, Poland
- CIB—Center of Innovation and Business Development, FOM University of Applied Sciences, 45127 Essen, Germany
3. Directional touch sensing for stiffness singularity search in an object using microfinger with tactile sensor. Sci Rep 2022; 12:21374. PMID: 36494492. PMCID: PMC9734658. DOI: 10.1038/s41598-022-25847-2. Received 1 Sep 2022; accepted 6 Dec 2022. Open access.
Abstract
Palpation is widely used as an initial step of medical diagnosis. The integration of micro tactile sensors and artificial muscles enables a soft microfinger to perform active touch sensing through its bending actuation. Active touch sensing with the push-in motion of the microfinger makes it possible to evaluate the stiffness distribution of an elastic object. Owing to its compactness, the microfinger can enter narrow spaces, such as the gastrointestinal and abdominal spaces of the body. However, a microfinger can only touch and sense a limited number of points. We therefore aim at an efficient method for locating a stiffness singularity in an elastic object using the directional touch sensing of a microfinger. This study presents a microfinger for active touch sensing using bending and push-in actuation, and proposes an algorithm that exploits the directivity of the microfinger's touch sensing for efficient localization of the stiffness singularity in an object. A gelatin block containing a small rigid ball was prepared and probed by the microfinger. The position of the buried rigid ball could be identified efficiently with the proposed algorithm. This result implies that the method has potential applications in endoscopic medical diagnosis, particularly in identifying tumor positions.
4. Gumbs AA, Grasso V, Bourdel N, Croner R, Spolverato G, Frigerio I, Illanes A, Abu Hilal M, Park A, Elyan E. The Advances in Computer Vision That Are Enabling More Autonomous Actions in Surgery: A Systematic Review of the Literature. Sensors (Basel) 2022; 22:4918. PMID: 35808408. PMCID: PMC9269548. DOI: 10.3390/s22134918. Received 20 May 2022; revised 21 Jun 2022; accepted 21 Jun 2022.
Abstract
This review focuses on the advances and current limitations of computer vision (CV) and on how CV can help us achieve more autonomous actions in surgery. It is a follow-up to an article we previously published in Sensors entitled "Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery?". Whereas that article also discussed machine learning, deep learning and natural language processing, this review delves deeper into the field of CV. Additionally, non-visual forms of data that can aid computerized robots in performing more autonomous actions, such as instrument priors and audio haptics, are highlighted. Furthermore, the current existential crisis for surgeons, endoscopists and interventional radiologists regarding greater autonomy during procedures is discussed. In summary, this paper discusses how to harness the power of CV to keep the doctors who perform interventions in the loop.
Affiliation(s)
- Andrew A. Gumbs
- Departement de Chirurgie Digestive, Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 78300 Poissy, France
- Department of Surgery, University of Magdeburg, 39106 Magdeburg, Germany
- Correspondence: Tel.: +33-139274873
- Vincent Grasso
- Family Christian Health Center, 31 West 155th St., Harvey, IL 60426, USA
- Nicolas Bourdel
- Gynecological Surgery Department, CHU Clermont Ferrand, 1 Place Lucie-Aubrac, 63100 Clermont-Ferrand, France
- EnCoV, Institut Pascal, UMR6602 CNRS, UCA, Clermont-Ferrand University Hospital, 63000 Clermont-Ferrand, France
- SurgAR-Surgical Augmented Reality, 63000 Clermont-Ferrand, France
- Roland Croner
- Department of Surgery, University of Magdeburg, 39106 Magdeburg, Germany
- Gaya Spolverato
- Department of Surgical, Oncological and Gastroenterological Sciences, University of Padova, 35122 Padova, Italy
- Isabella Frigerio
- Department of Hepato-Pancreato-Biliary Surgery, Pederzoli Hospital, 37019 Peschiera del Garda, Italy
- Alfredo Illanes
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Mohammad Abu Hilal
- Unità Chirurgia Epatobiliopancreatica, Robotica e Mininvasiva, Fondazione Poliambulanza Istituto Ospedaliero, Via Bissolati 57, 25124 Brescia, Italy
- Adrian Park
- Anne Arundel Medical Center, Johns Hopkins University, Annapolis, MD 21401, USA
- Eyad Elyan
- School of Computing, Robert Gordon University, Aberdeen AB10 7JG, UK
5. Gumbs AA, Frigerio I, Spolverato G, Croner R, Illanes A, Chouillard E, Elyan E. Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery? Sensors (Basel) 2021; 21:5526. PMID: 34450976. PMCID: PMC8400539. DOI: 10.3390/s21165526. Received 6 Jul 2021; revised 3 Aug 2021; accepted 11 Aug 2021.
Abstract
Most surgeons are skeptical as to the feasibility of autonomous actions in surgery. Interestingly, many examples of autonomous actions already exist and have been around for years. Since the beginning of this millennium, the field of artificial intelligence (AI) has grown exponentially with the development of machine learning (ML), deep learning (DL), computer vision (CV) and natural language processing (NLP). All of these facets of AI will be fundamental to the development of more autonomous actions in surgery; unfortunately, only a limited number of surgeons have or seek expertise in this rapidly evolving field. As opposed to AI in medicine, AI surgery (AIS) involves autonomous movements. Fortuitously, as the field of robotics in surgery has improved, more surgeons are becoming interested in technology and in the potential of autonomous actions in procedures such as interventional radiology, endoscopy and surgery. The lack of haptics, or the sensation of touch, has hindered the wider adoption of robotics by many surgeons; however, now that the true potential of robotics can be comprehended, the embracing of AI by the surgical community is more important than ever before. Although current complete surgical systems are mainly examples of tele-manipulation, haptics is perhaps not the most important aspect of getting to more autonomously functioning robots. If the goal is for robots to ultimately become more and more independent, research should perhaps focus not on haptics as it is perceived by humans but on haptics as it is perceived by robots/computers. This article discusses aspects of ML, DL, CV and NLP as they pertain to the modern practice of surgery, with a focus on current AI issues and advances that will enable us to get to more autonomous actions in surgery. Ultimately, a paradigm shift may need to occur in the surgical community, as more surgeons with expertise in AI may be needed to fully unlock the potential of AIS in a safe, efficacious and timely manner.
Affiliation(s)
- Andrew A. Gumbs
- Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 10 Rue Champ de Gaillard, 78300 Poissy, France
- Isabella Frigerio
- Department of Hepato-Pancreato-Biliary Surgery, Pederzoli Hospital, 37019 Peschiera del Garda, Italy
- Gaya Spolverato
- Department of Surgical, Oncological and Gastroenterological Sciences, University of Padova, 35122 Padova, Italy
- Roland Croner
- Department of General-, Visceral-, Vascular- and Transplantation Surgery, University of Magdeburg, Haus 60a, Leipziger Str. 44, 39120 Magdeburg, Germany
- Alfredo Illanes
- INKA-Innovation Laboratory for Image Guided Therapy, Medical Faculty, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Elie Chouillard
- Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 10 Rue Champ de Gaillard, 78300 Poissy, France
- Eyad Elyan
- School of Computing, Robert Gordon University, Aberdeen AB10 7JG, UK
6. Sühn T, Spiller M, Salvi R, Hellwig S, Boese A, Illanes A, Friebe M. Auscultation System for Acquisition of Vascular Sounds - Towards Sound-Based Monitoring of the Carotid Artery. Med Devices (Auckl) 2020; 13:349-364. PMID: 33162758. PMCID: PMC7642592. DOI: 10.2147/mder.s268057. Received 27 Jun 2020; accepted 23 Sep 2020. Open access.
Abstract
Introduction: Atherosclerotic diseases of the carotid artery are a primary cause of cerebrovascular events such as stroke. For diagnosis and monitoring, angiography and ultrasound- or magnetic resonance-based imaging are used, which require costly hardware. In contrast, the auscultation of carotid sounds and screening for bruits, audible patterns related to turbulent blood flow, is a simple examination with comparatively low technical demands. It can indicate atherosclerotic disease and justify further diagnostics but is currently subjective and examiner-dependent.

Methods: We propose an easy-to-use computer-assisted auscultation system for stable and reproducible acquisition of vascular sounds of the carotid. A dedicated skin-transducer interface was incorporated into a handheld device. The interface comprises two bell-shaped structures, one with an additional acoustic membrane, to ensure defined skin contact and a stable propagation path for the sound. The device connects wirelessly to a desktop application that allows real-time visualization, assessment of signal quality and input of supplementary information, along with storage of the recordings in a database. An experimental study with five healthy subjects was conducted to evaluate the usability and stability of the device. Five recordings per carotid served as the data basis for a wavelet-based analysis of the stability of the spectral characteristics of the recordings.

Results: The energy distribution of the wavelet-based stationary spectra proved stable for measurements of a particular carotid, with the majority of the energy located between 3 and 40 Hz. Different spectral properties of the two carotids of one individual indicate the presence of sound characteristics linked to the particular vessel. User-dependent parameters, such as variations of the applied contact pressure, appeared to have little influence on the overall stability.

Conclusion: The system provides a platform for reproducible carotid auscultation and for the creation of a database of pathological vascular sounds, a prerequisite for investigating sound-based vascular monitoring.
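The reported concentration of energy between 3 and 40 Hz suggests a simple quantitative check: the fraction of total spectral energy that falls inside that band. A minimal sketch, in which the sampling rate and the synthetic test signals are assumptions:

```python
import numpy as np

FS = 200  # sampling rate in Hz (assumption; carotid sounds are low-frequency)

def band_energy_fraction(x, fs, lo=3.0, hi=40.0):
    """Fraction of total spectral energy inside the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2   # power spectrum, DC removed
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].sum() / spectrum.sum()

t = np.arange(4 * FS) / FS                  # 4 s of signal
bruit_like = np.sin(2 * np.pi * 10.0 * t)   # energy well inside 3-40 Hz
hiss_like = np.sin(2 * np.pi * 80.0 * t)    # energy outside the band
```

A consistently high in-band fraction across repeated recordings of the same carotid would be one way to express the stability the study reports.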
Affiliation(s)
- Thomas Sühn
- INKA - Innovation Laboratory for Image Guided Therapy, Medizinische Fakultät, Otto-von-Guericke-Universität, Magdeburg, Sachsen-Anhalt, Germany
- Moritz Spiller
- INKA - Innovation Laboratory for Image Guided Therapy, Medizinische Fakultät, Otto-von-Guericke-Universität, Magdeburg, Sachsen-Anhalt, Germany
- Rutuja Salvi
- IDTM GmbH, Castrop-Rauxel, Nordrhein-Westfalen, Germany
- Axel Boese
- INKA - Innovation Laboratory for Image Guided Therapy, Medizinische Fakultät, Otto-von-Guericke-Universität, Magdeburg, Sachsen-Anhalt, Germany
- Alfredo Illanes
- INKA - Innovation Laboratory for Image Guided Therapy, Medizinische Fakultät, Otto-von-Guericke-Universität, Magdeburg, Sachsen-Anhalt, Germany
- Michael Friebe
- INKA - Innovation Laboratory for Image Guided Therapy, Medizinische Fakultät, Otto-von-Guericke-Universität, Magdeburg, Sachsen-Anhalt, Germany
7. Illanes A, Schaufler A, Sühn T, Boese A, Croner R, Friebe M. Surgical audio information as base for haptic feedback in robotic-assisted procedures. Curr Dir Biomed Eng 2020. DOI: 10.1515/cdbme-2020-0036. Open access.
Abstract
This work aims to demonstrate the feasibility of acquiring haptic information from a da Vinci robotic tool using audio sensing, with sensor placements that meet the requirements of a real clinical scenario. Two potential audio sensor locations were studied using an experimental setup that performs repeatable interactions of a da Vinci forceps with three different tissues. The resulting audio signals were assessed in terms of their signal-to-noise ratio (SNR) and their capability to distinguish between tissues. A spectral energy distribution analysis using the discrete wavelet transform was performed to extract signal signatures of the tested tissues. The results show that a high SNR was obtained in most of the audio recordings acquired at both studied positions. Additionally, clear spectral energy-related patterns could be extracted from the audio signals, allowing the palpated tissues to be distinguished.
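The spectral energy distribution analysis via the discrete wavelet transform can be illustrated with a Haar DWT, whose per-level detail-band energies form a simple signal signature. The Haar wavelet, the level count, and the synthetic signals are assumptions; the paper does not specify its wavelet.

```python
import numpy as np

def haar_band_energies(x, levels=4):
    """Relative energy per Haar-DWT band: detail bands (finest first),
    then the final approximation band. Assumes len(x) divisible by 2**levels."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass half-band
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass half-band
        energies.append(np.sum(detail ** 2))
        x = approx
    energies.append(np.sum(x ** 2))   # remaining approximation energy
    e = np.array(energies)
    return e / e.sum()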
Affiliation(s)
- Alfredo Illanes
- Otto-von-Guericke University Magdeburg, Medical Faculty, Magdeburg, Germany
- Anna Schaufler
- Otto-von-Guericke University Magdeburg, Medical Faculty, Magdeburg, Germany
- Thomas Sühn
- Otto-von-Guericke University Magdeburg, Medical Faculty, Magdeburg, Germany
- Axel Boese
- Otto-von-Guericke University Magdeburg, Medical Faculty, Magdeburg, Germany
- Roland Croner
- Clinic for General, Visceral, Vascular and Transplant Surgery, Otto-von-Guericke University, Magdeburg, Germany
- Michael Friebe
- Otto-von-Guericke University Magdeburg, Medical Faculty, Magdeburg, Germany