1
Fang Y, Huang Z, Leng H, Liu D, Wang W. A new HCM heart sound classification method based on weighted bispectrum features. Phys Eng Sci Med 2025;48:207-220. PMID: 39883386. DOI: 10.1007/s13246-024-01506-w.
Abstract
Hypertrophic cardiomyopathy (HCM), including obstructive HCM and non-obstructive HCM, can lead to sudden cardiac arrest in adolescents and athletes. Early diagnosis and treatment through auscultation of different types of HCM can prevent the occurrence of malignant events. However, it is challenging to distinguish the pathological information of HCM related to differential left ventricular outflow tract pressure gradients. To address this issue, a classification method based on weighted bispectrum features of heart sounds (HSs) is proposed for efficient and cost-effective HCM analysis. Preprocessing is first applied to remove background noise during HS acquisition. Then, the bispectrum contour map is calculated, and 56-dimensional features are extracted to represent the pathological information of HCM. Next, an adaptive threshold weighting mutual information method is proposed for feature selection and weighted fusion. Finally, the CNN-RF classifier model is built to automatically identify different types of HCM cases. A clinical dataset of normal and two types of HCM HSs is utilized for validation. The results show that the proposed method performs well, with a classification accuracy reaching 94.4%. It provides a reliable reference for HCM diagnosis in young patients in clinical settings.
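The pipeline in this abstract (bispectrum contour map, then feature extraction, then a CNN-RF classifier) begins with estimating a bispectrum. A minimal direct-method sketch is shown below; the segment length, window, and the synthetic phase-coupled test signal are illustrative assumptions, not details from the paper:

```python
import numpy as np

def bispectrum(x, nfft=128):
    """Direct (FFT-based) bispectrum estimate, averaged over segments:
    B(f1, f2) = < X(f1) * X(f2) * conj(X(f1 + f2)) >."""
    segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, nfft)]
    n = nfft // 2                                   # non-redundant low-frequency quadrant
    idx = np.add.outer(np.arange(n), np.arange(n))  # f1 + f2 for each bifrequency pair
    B = np.zeros((n, n), dtype=complex)
    for s in segs:
        X = np.fft.fft(s * np.hanning(nfft))
        B += np.outer(X[:n], X[:n]) * np.conj(X[idx])
    return np.abs(B) / max(len(segs), 1)

# Phase-coupled test signal: 150 Hz = 60 Hz + 90 Hz with coherent phases,
# so a bispectral peak is expected near the bifrequency (60 Hz, 90 Hz);
# Gaussian noise alone would show no such peak.
fs = 1000
t = np.arange(4096) / fs
x = (np.cos(2 * np.pi * 60 * t) + np.cos(2 * np.pi * 90 * t)
     + np.cos(2 * np.pi * 150 * t))
B = bispectrum(x)
```

With fs = 1000 Hz and nfft = 128, the coupled pair falls near bins (8, 12); the magnitude map `B` is the kind of contour from which the paper's 56-dimensional feature vector would be derived.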
Affiliation(s)
- Fang Yu
- School of Electrical Engineering and Electronic Information, Xihua University, Chengdu, China
- Huang Zhiyuan
- School of Electrical Engineering and Electronic Information, Xihua University, Chengdu, China
- Leng Hongxia
- School of Electrical Engineering and Electronic Information, Xihua University, Chengdu, China
- Dongbo Liu
- School of Electrical Engineering and Electronic Information, Xihua University, Chengdu, China
- Wang Weibo
- School of Electrical Engineering and Electronic Information, Xihua University, Chengdu, China
2
Sühn T, Esmaeili N, Spiller M, Costa M, Boese A, Bertrand J, Pandey A, Lohmann C, Friebe M, Illanes A. Vibro-acoustic sensing of tissue-instrument-interactions allows a differentiation of biological tissue in computerised palpation. Comput Biol Med 2023;164:107272. PMID: 37515873. DOI: 10.1016/j.compbiomed.2023.107272.
Abstract
BACKGROUND The shift towards minimally invasive surgery is associated with a significant reduction of the tactile information available to the surgeon, with compensation strategies ranging from vision-based techniques to the integration of sensing concepts into surgical instruments. Tactile information is vital for palpation tasks such as the differentiation of tissues or the characterisation of surfaces. This work investigates a new sensing approach that derives palpation-related information from vibration signals originating from instrument-tissue interactions. METHODS We conducted a feasibility study to differentiate three non-animal and three animal tissue specimens based on palpation of the surface. A sensor configuration was mounted at the proximal end of a standard instrument, opposite the tissue-interaction point. Vibro-acoustic signals of 1680 palpation events were acquired, and the time-varying spectrum was computed using continuous wavelet transformation. For validation, nine spectral energy-related features were calculated for subsequent classification using a linear support vector machine and k-nearest neighbours. RESULTS Indicators derived from the vibration signal are highly stable across palpations of the same tissue specimen, regardless of the palpating subject. Differences in the surface texture of the tissue specimens are reflected in those indicators and can serve as a basis for differentiation. Classification following a supervised learning approach shows an accuracy of >93.8% for the three-tissue classification tasks, decreasing to 78.8% for a combination of all six tissues. CONCLUSIONS Simple features derived from the vibro-acoustic signals facilitate the differentiation of biological tissues, showing the potential of the presented approach to provide information related to the interacting tissue. The results encourage further investigation of a yet little-exploited source of information in minimally invasive surgery.
Affiliation(s)
- Thomas Sühn
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany; SURAG Medical GmbH, Leipzig, Germany.
- Maximilian Costa
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany.
- Axel Boese
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University, Magdeburg, Germany.
- Jessica Bertrand
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany.
- Ajay Pandey
- Queensland University of Technology, School of Electrical Engineering & Robotics, Brisbane, Australia.
- Christoph Lohmann
- Department of Orthopaedic Surgery, Otto-von-Guericke University/University Hospital, Magdeburg, Germany.
- Michael Friebe
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University, Magdeburg, Germany; AGH University of Science and Technology, Department of Measurement and Electronics, Kraków, Poland; CIB - Center of Innovation and Business Development, FOM University of Applied Sciences, Essen, Germany.
3
Sühn T, Esmaeili N, Mattepu SY, Spiller M, Boese A, Urrutia R, Poblete V, Hansen C, Lohmann CH, Illanes A, Friebe M. Vibro-Acoustic Sensing of Instrument Interactions as a Potential Source of Texture-Related Information in Robotic Palpation. Sensors (Basel) 2023;23:3141. PMID: 36991854. PMCID: PMC10056323. DOI: 10.3390/s23063141.
Abstract
The direct tactile assessment of surface textures during palpation is an essential component of open surgery that is impeded in minimally invasive and robot-assisted surgery. When indirectly palpating with a surgical instrument, the structural vibrations from this interaction contain tactile information that can be extracted and analysed. This study investigates the influence of the contact angle α and the palpation velocity v on the vibro-acoustic signals from this indirect palpation. A 7-DOF robotic arm, a standard surgical instrument, and a vibration measurement system were used to palpate three different materials with varying α and v. The signals were processed based on continuous wavelet transformation. They showed material-specific signatures in the time-frequency domain that retained their general characteristics for varying α and v. Energy-related and statistical features were extracted, and supervised classification was performed, where the testing data comprised only signals acquired with palpation parameters different from those of the training data. The support vector machine and k-nearest neighbours classifiers provided 99.67% and 96.00% accuracy, respectively, for the differentiation of the materials. The results indicate the robustness of the features against variations in the palpation parameters. This is a prerequisite for an application in minimally invasive surgery but needs to be confirmed in realistic experiments with biological tissues.
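The wavelet-energy-plus-classifier scheme described here can be sketched end to end on synthetic signals. Everything below is an illustrative assumption: a hand-rolled Ricker-wavelet CWT stands in for the paper's continuous wavelet transformation, synthetic sinusoid-plus-noise "textures" stand in for recorded palpation vibrations, and a nearest-centroid rule stands in for the SVM / k-NN classifiers:

```python
import numpy as np

rng = np.random.default_rng(0)

def ricker(n, a):
    """Unit-norm Ricker (Mexican hat) wavelet sampled at n points, scale a."""
    t = np.arange(n) - (n - 1) / 2.0
    return (2 / (np.sqrt(3 * a) * np.pi ** 0.25)
            * (1 - (t / a) ** 2) * np.exp(-t ** 2 / (2 * a ** 2)))

def scale_energies(x, widths):
    """Per-scale energy of a minimal CWT, z-scored as a feature vector."""
    e = [np.sum(np.convolve(x, ricker(10 * w, w), mode='same') ** 2)
         for w in widths]
    v = np.log(np.asarray(e))
    return (v - v.mean()) / (v.std() + 1e-12)

def make_texture(kind, n=2048):
    """Synthetic vibration signal: textures differ in dominant frequency."""
    f = {'smooth': 0.02, 'medium': 0.08, 'rough': 0.25}[kind]  # cycles/sample
    return np.sin(2 * np.pi * f * np.arange(n)) + 0.3 * rng.standard_normal(n)

widths = [1, 2, 4, 8, 16, 32]
kinds = ['smooth', 'medium', 'rough']
# One centroid per class from 10 training signals (nearest-centroid stand-in
# for the paper's supervised SVM / k-NN step).
train = {k: np.mean([scale_energies(make_texture(k), widths)
                     for _ in range(10)], axis=0) for k in kinds}

def classify(x):
    f = scale_energies(x, widths)
    return min(kinds, key=lambda k: np.linalg.norm(f - train[k]))

preds = [classify(make_texture(k)) for k in kinds]
```

The design point the abstract makes is that the scale-energy profile, not the raw signal, carries the texture signature, which is why it can stay stable under varying contact angle and velocity.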
Affiliation(s)
- Thomas Sühn
- Department of Orthopaedic Surgery, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- SURAG Medical GmbH, 39118 Magdeburg, Germany
- Sandeep Y. Mattepu
- INKA Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Axel Boese
- INKA Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Robin Urrutia
- Instituto de Acústica, Facultad de Ciencias de la Ingeniería, Universidad Austral de Chile, Valdivia 5111187, Chile
- Victor Poblete
- Instituto de Acústica, Facultad de Ciencias de la Ingeniería, Universidad Austral de Chile, Valdivia 5111187, Chile
- Christian Hansen
- Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany
- Christoph H. Lohmann
- Department of Orthopaedic Surgery, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Michael Friebe
- INKA Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Department of Measurement and Electronics, AGH University of Science and Technology, 30-059 Kraków, Poland
- CIB—Center of Innovation and Business Development, FOM University of Applied Sciences, 45127 Essen, Germany
4
Gumbs AA, Grasso V, Bourdel N, Croner R, Spolverato G, Frigerio I, Illanes A, Abu Hilal M, Park A, Elyan E. The Advances in Computer Vision That Are Enabling More Autonomous Actions in Surgery: A Systematic Review of the Literature. Sensors (Basel) 2022;22:4918. PMID: 35808408. PMCID: PMC9269548. DOI: 10.3390/s22134918.
Abstract
This review focuses on advances and current limitations of computer vision (CV) and on how CV can help us achieve more autonomous actions in surgery. It is a follow-up to an article we previously published in Sensors entitled "Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery?" Whereas that article also discussed issues of machine learning, deep learning and natural language processing, this review delves deeper into the field of CV. Additionally, non-visual forms of data that can aid computerized robots in the performance of more autonomous actions, such as instrument priors and audio haptics, are highlighted. Furthermore, the current existential crisis for surgeons, endoscopists and interventional radiologists regarding more autonomy during procedures is discussed. In summary, this paper discusses how to harness the power of CV to keep the doctors who perform interventions in the loop.
Affiliation(s)
- Andrew A. Gumbs
- Departement de Chirurgie Digestive, Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 78300 Poissy, France
- Department of Surgery, University of Magdeburg, 39106 Magdeburg, Germany
- Vincent Grasso
- Family Christian Health Center, 31 West 155th St., Harvey, IL 60426, USA
- Nicolas Bourdel
- Gynecological Surgery Department, CHU Clermont Ferrand, 1 Place Lucie-Aubrac, 63100 Clermont-Ferrand, France
- EnCoV, Institut Pascal, UMR6602 CNRS, UCA, Clermont-Ferrand University Hospital, 63000 Clermont-Ferrand, France
- SurgAR-Surgical Augmented Reality, 63000 Clermont-Ferrand, France
- Roland Croner
- Department of Surgery, University of Magdeburg, 39106 Magdeburg, Germany
- Gaya Spolverato
- Department of Surgical, Oncological and Gastroenterological Sciences, University of Padova, 35122 Padova, Italy
- Isabella Frigerio
- Department of Hepato-Pancreato-Biliary Surgery, Pederzoli Hospital, 37019 Peschiera del Garda, Italy
- Alfredo Illanes
- INKA-Innovation Laboratory for Image Guided Therapy, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Mohammad Abu Hilal
- Unità Chirurgia Epatobiliopancreatica, Robotica e Mininvasiva, Fondazione Poliambulanza Istituto Ospedaliero, Via Bissolati 57, 25124 Brescia, Italy
- Adrian Park
- Anne Arundel Medical Center, Johns Hopkins University, Annapolis, MD 21401, USA
- Eyad Elyan
- School of Computing, Robert Gordon University, Aberdeen AB10 7JG, UK
5
Application of machine learning in wire damage detection for safety procedure. Soft Comput 2022. DOI: 10.1007/s00500-022-06747-z.
6
Gumbs AA, Frigerio I, Spolverato G, Croner R, Illanes A, Chouillard E, Elyan E. Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery? Sensors (Basel) 2021;21:5526. PMID: 34450976. PMCID: PMC8400539. DOI: 10.3390/s21165526.
Abstract
Most surgeons are skeptical as to the feasibility of autonomous actions in surgery. Interestingly, many examples of autonomous actions already exist and have been around for years. Since the beginning of this millennium, the field of artificial intelligence (AI) has grown exponentially with the development of machine learning (ML), deep learning (DL), computer vision (CV) and natural language processing (NLP). All of these facets of AI will be fundamental to the development of more autonomous actions in surgery; unfortunately, only a limited number of surgeons have or seek expertise in this rapidly evolving field. As opposed to AI in medicine, AI surgery (AIS) involves autonomous movements. Fortuitously, as the field of robotics in surgery has improved, more surgeons are becoming interested in technology and in the potential of autonomous actions in procedures such as interventional radiology, endoscopy and surgery. The lack of haptics, or the sensation of touch, has hindered the wider adoption of robotics by many surgeons; however, now that the true potential of robotics can be comprehended, the embracing of AI by the surgical community is more important than ever before. Although current complete surgical systems are mainly examples of tele-manipulation, haptics is perhaps not the most important aspect of getting to more autonomously functioning robots. If the goal is for robots to ultimately become more and more independent, perhaps research should focus not on haptics as it is perceived by humans but on haptics as it is perceived by robots/computers. This article discusses aspects of ML, DL, CV and NLP as they pertain to the modern practice of surgery, with a focus on current AI issues and advances that will enable us to get to more autonomous actions in surgery. Ultimately, a paradigm shift may need to occur in the surgical community, as more surgeons with expertise in AI may be needed to fully unlock the potential of AIS in a safe, efficacious and timely manner.
Affiliation(s)
- Andrew A. Gumbs
- Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 10 Rue Champ de Gaillard, 78300 Poissy, France
- Isabella Frigerio
- Department of Hepato-Pancreato-Biliary Surgery, Pederzoli Hospital, 37019 Peschiera del Garda, Italy
- Gaya Spolverato
- Department of Surgical, Oncological and Gastroenterological Sciences, University of Padova, 35122 Padova, Italy
- Roland Croner
- Department of General-, Visceral-, Vascular- and Transplantation Surgery, University of Magdeburg, Haus 60a, Leipziger Str. 44, 39120 Magdeburg, Germany
- Alfredo Illanes
- INKA-Innovation Laboratory for Image Guided Therapy, Medical Faculty, Otto-von-Guericke University Magdeburg, 39120 Magdeburg, Germany
- Elie Chouillard
- Centre Hospitalier Intercommunal de Poissy/Saint-Germain-en-Laye, 10 Rue Champ de Gaillard, 78300 Poissy, France
- Eyad Elyan
- School of Computing, Robert Gordon University, Aberdeen AB10 7JG, UK
7
Preliminary study in the analysis of the severity of cardiac pathologies using the higher-order spectra on the heart-beats signals. Pol J Med Phys Eng 2021. DOI: 10.2478/pjmpe-2021-0010.
Abstract
Phonocardiography is a technique for recording and interpreting the mechanical activity of the heart. The recordings generated by this technique are called phonocardiograms (PCG). PCG signals are acoustic waves revealing a wealth of clinical information about cardiac health; presented visually, they enable doctors to better understand heart sounds. Hence, multiple approaches have been proposed to analyze heart sounds based on PCG recordings. Due to the complexity and the highly nonlinear nature of these signals, a computer-aided technique based on higher-order statistics (HOS) is employed; it is an important tool because it takes the non-linearity of the PCG signals into account. This method, also known as the bispectrum technique, can provide significant information to enhance the diagnosis for an accurate and objective interpretation of heart condition.
The objective of this paper is a preliminary test of the parameters that can discriminate between the signals of different pathologies and characterize cardiac abnormalities.
This preliminary study is carried out on a reduced sample (nine signals) before being applied subsequently to a larger one. The work examines the effectiveness of the bispectrum technique in analysing the pathological severity of different PCG signals. The presented approach showed that the HOS technique has good potential for the pathological discrimination of various PCG signals.
8
Pathological discrimination of the phonocardiogram signal using the bispectral technique. Phys Eng Sci Med 2020;43:1371-1385. PMID: 33165819. DOI: 10.1007/s13246-020-00943-7.
Abstract
Phonocardiography is a dynamic, non-invasive and relatively low-cost technique used to monitor the mechanical activity of the heart. The recordings generated by this technique are called phonocardiogram (PCG) signals. When shown visually, PCG signals can give medical doctors more insight into heart sounds. Thus, several approaches have been proposed to analyse these sounds through PCG recordings. However, due to the complexity and the highly nonlinear nature of these recordings, a computer-assisted technique based on higher-order statistics (HOS) has proven to be an important tool in PCG signal processing. One of these techniques is the third-order spectrum, known as the bispectrum; it can provide significant information to support physicians in an accurate and objective interpretation of heart condition. This technique is implemented and discussed in this paper and used to analyse heart severity on nine different PCG recordings: normal, innocent murmur, coarctation of the aorta, ejection click, atrial gallop, opening snap, aortic stenosis, drum rumble, and aortic regurgitation. A unique bispectrum representation is generated for each type of heart sound signal. Then, based on the bispectrum analysis, fifteen higher-order spectral (HOS) features, such as the bispectral amplitude, the entropies, the moments, and the weighted center, are extracted from each PCG record. The obtained HOS features showed an evolution well correlated with increasing heart severity, and therefore a high potential for discriminating pathological PCG signals.
Note that classification of pathological PCG signals generally refers to distinguishing the presence of a pathology from its absence (a binary response), whereas the discrimination considered in this paper provides an analogue value that can vary from one pathology to another in an increasing or decreasing way.
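The abstract names feature families (bispectral amplitude, entropies, moments, weighted center) without giving formulas. The sketch below computes a few commonly used bispectral descriptors from a precomputed magnitude bispectrum; the paper's exact fifteen features are not enumerated in the abstract, and the toy matrices stand in for actual PCG bispectra:

```python
import numpy as np

def hos_features(B):
    """A few commonly used bispectrum descriptors.

    B : 2-D array of bispectral magnitudes |B(f1, f2)| over the analysed region.
    """
    p = B / B.sum()                       # normalise magnitudes to a distribution
    ent = -np.sum(p * np.log(p + 1e-12))  # bispectral entropy...
    ent /= np.log(p.size)                 # ...scaled to [0, 1]
    log_amp = np.sum(np.log(B + 1e-12))   # sum of logarithmic amplitudes
    f1 = np.arange(B.shape[0])[:, None]
    f2 = np.arange(B.shape[1])[None, :]
    wcob1 = np.sum(f1 * B) / B.sum()      # weighted centre of bispectrum, f1 axis
    wcob2 = np.sum(f2 * B) / B.sum()      # weighted centre of bispectrum, f2 axis
    return {'entropy': float(ent), 'log_amp': float(log_amp),
            'wcob': (float(wcob1), float(wcob2))}

# A flat bispectrum has normalised entropy near 1; a single dominant peak
# drives the entropy towards 0 and pulls the weighted centre towards it.
flat = np.ones((64, 64))
peaky = np.full((64, 64), 1e-6)
peaky[8, 12] = 1.0
```

Descriptors of this kind vary smoothly with how concentrated the bispectrum is, which is what makes them usable as an analogue severity scale rather than only a binary label.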
9
Chen CH, Sühn T, Kalmar M, Maldonado I, Wex C, Croner R, Boese A, Friebe M, Illanes A. Texture differentiation using audio signal analysis with robotic interventional instruments. Comput Biol Med 2019;112:103370. PMID: 31374348. DOI: 10.1016/j.compbiomed.2019.103370.
Abstract
Robotic minimally invasive surgery (RMIS) has played an important role in recent decades. In traditional surgery, surgeons rely on palpation with their hands. During RMIS, however, surgeons use visual-haptic techniques to compensate for the missing sense of touch. Various sensors have been widely investigated for retrieving this natural sense, but issues such as integration, cost, sterilization and the small sensing area still prevent such approaches from being applied. A new method based on acoustic emission has recently been proposed for acquiring audio information from tool-tissue interaction during minimally invasive procedures, providing user guidance feedback. In this work the concept was adapted for acquiring audio information from an RMIS grasper, and a first proof of concept is presented. Interactions of the grasper with various artificial and biological texture samples were recorded and analyzed using advanced signal processing, and a clear correlation between audio spectral components and the tested textures was identified.
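The reported correlation between audio spectral components and texture can be illustrated with a simple spectral-centroid comparison. The signals below are synthetic stand-ins for grasper audio, not the study's data, and the assumption that a finer texture excites higher frequencies is illustrative:

```python
import numpy as np

def spectral_centroid(x, fs):
    """Power-weighted mean frequency of a signal's spectrum, in Hz."""
    p = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), d=1 / fs)
    return float(np.sum(f * p) / np.sum(p))

rng = np.random.default_rng(1)
fs, n = 8000, 8192
t = np.arange(n) / fs

# Stand-ins for grasper audio: a coarse texture dominated by a low-frequency
# component, a fine texture by a high-frequency one, both with broadband noise.
coarse = np.sin(2 * np.pi * 200 * t) + 0.2 * rng.standard_normal(n)
fine = np.sin(2 * np.pi * 1500 * t) + 0.2 * rng.standard_normal(n)

c_coarse = spectral_centroid(coarse, fs)
c_fine = spectral_centroid(fine, fs)
```

A scalar summary like the centroid is the simplest form of the spectral-component/texture correlation the abstract describes; the paper's analysis works on richer spectral representations.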
Affiliation(s)
- C H Chen
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany
- T Sühn
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany
- M Kalmar
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany
- I Maldonado
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany
- C Wex
- Clinic for General, Visceral, Vascular and Transplant Surgery, Otto-von-Guericke University, Magdeburg, Germany
- R Croner
- Clinic for General, Visceral, Vascular and Transplant Surgery, Otto-von-Guericke University, Magdeburg, Germany
- A Boese
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany
- M Friebe
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany
- A Illanes
- INKA Intelligente Katheter, Otto-von-Guericke University, Magdeburg, Germany