1. Ou Z, Bai J, Chen Z, Lu Y, Wang H, Long S, Chen G. RTSeg-net: A lightweight network for real-time segmentation of fetal head and pubic symphysis from intrapartum ultrasound images. Comput Biol Med 2024;175:108501. PMID: 38703545. DOI: 10.1016/j.compbiomed.2024.108501.
Abstract
The segmentation of the fetal head (FH) and pubic symphysis (PS) from intrapartum ultrasound images plays a pivotal role in monitoring labor progression and informing crucial clinical decisions. Achieving real-time segmentation with high accuracy on systems with limited hardware capabilities presents significant challenges. To address these challenges, we propose the real-time segmentation network (RTSeg-Net), a groundbreaking lightweight deep learning model that incorporates innovative distribution shifting convolutional blocks, tokenized multilayer perceptron blocks, and efficient feature fusion blocks. Designed for optimal computational efficiency, RTSeg-Net minimizes resource demand while significantly enhancing segmentation performance. Our comprehensive evaluation on two distinct intrapartum ultrasound image datasets reveals that RTSeg-Net achieves segmentation accuracy on par with more complex state-of-the-art networks, utilizing merely 1.86 M parameters (roughly 6% of their parameter counts) and operating seven times faster, achieving a remarkable rate of 31.13 frames per second on a Jetson Nano, a device known for its limited computing capacity. These achievements underscore RTSeg-Net's potential to provide accurate, real-time segmentation on low-power devices, broadening the scope for its application across various stages of labor. By facilitating real-time, accurate ultrasound image analysis on portable, low-cost devices, RTSeg-Net promises to revolutionize intrapartum monitoring, making sophisticated diagnostic tools accessible to a wider range of healthcare settings.
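Throughput figures like the 31.13 frames per second above are typically obtained by timing repeated inference calls after a warm-up phase. The sketch below is a generic, framework-free measurement harness; the `segment` callable is a hypothetical stand-in, not RTSeg-Net:

```python
import time

def measure_fps(infer, frame, n_warmup=3, n_runs=50):
    """Time n_runs calls to infer(frame) after a warm-up; return frames/sec."""
    for _ in range(n_warmup):        # warm-up: exclude one-off setup costs
        infer(frame)
    start = time.perf_counter()
    for _ in range(n_runs):
        infer(frame)
    elapsed = time.perf_counter() - start
    return n_runs / elapsed

# Stand-in "model": a cheap per-pixel threshold over a fake single-channel frame.
fake_frame = [0.1, 0.7, 0.4] * 10_000
segment = lambda frame: [1 if p > 0.5 else 0 for p in frame]

print(f"{measure_fps(segment, fake_frame):.0f} FPS")
```

On real hardware the same loop would wrap the network's forward pass (and, on a GPU, a device synchronization) instead of the toy threshold.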
Affiliation(s)
- Zhanhong Ou
- Guangdong Provincial Key Laboratory of Traditional Chinese Medicine Information Technology, Jinan University, Guangzhou, 510632, China
- Jieyun Bai
- Guangdong Provincial Key Laboratory of Traditional Chinese Medicine Information Technology, Jinan University, Guangzhou, 510632, China; Auckland Bioengineering Institute, University of Auckland, Auckland, 1010, New Zealand
- Zhide Chen
- Guangdong Provincial Key Laboratory of Traditional Chinese Medicine Information Technology, Jinan University, Guangzhou, 510632, China
- Yaosheng Lu
- Guangdong Provincial Key Laboratory of Traditional Chinese Medicine Information Technology, Jinan University, Guangzhou, 510632, China
- Huijin Wang
- Guangdong Provincial Key Laboratory of Traditional Chinese Medicine Information Technology, Jinan University, Guangzhou, 510632, China
- Shun Long
- Guangdong Provincial Key Laboratory of Traditional Chinese Medicine Information Technology, Jinan University, Guangzhou, 510632, China
- Gaowen Chen
- Obstetrics and Gynecology Center, Zhujiang Hospital, Southern Medical University, Guangzhou, 510280, China
2. Dick K, Humber J, Ducharme R, Dingwall-Harvey A, Armour CM, Hawken S, Walker MC. The Transformative Potential of AI in Obstetrics and Gynaecology. J Obstet Gynaecol Can 2024;46:102277. PMID: 37951574. DOI: 10.1016/j.jogc.2023.102277.
Abstract
The transformative power of artificial intelligence (AI) is reshaping diverse domains of medicine. Recent progress, catalyzed by computing advancements, has seen commensurate adoption of AI technologies within obstetrics and gynaecology. We explore the use and potential of AI in three focus areas: predictive modelling for pregnancy complications, deep learning-based image interpretation for precise diagnoses, and large language models enabling intelligent health care assistants. We also provide recommendations for the ethical implementation and governance of AI and promote research into AI explainability, both crucial for responsible AI integration and deployment. AI promises a revolutionary era of personalized health care in obstetrics and gynaecology.
Affiliation(s)
- Kevin Dick
- Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON
- James Humber
- Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON
- Robin Ducharme
- Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON
- Alysha Dingwall-Harvey
- Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON; Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON
- Christine M Armour
- Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON; Department of Pediatrics, University of Ottawa, Ottawa, ON; Prenatal Screening Ontario, Better Outcomes Registry and Network, Ottawa, ON
- Steven Hawken
- Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON; Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON; School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON; ICES, Toronto, ON
- Mark C Walker
- Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON; Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON; School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON; ICES, Toronto, ON; Department of Obstetrics and Gynecology, University of Ottawa, Ottawa, ON; International and Global Health Office, University of Ottawa, Ottawa, ON; BORN Ontario, Children's Hospital of Eastern Ontario, Ottawa, ON; Department of Obstetrics, Gynecology and Newborn Care, The Ottawa Hospital, Ottawa, ON
3. Enache IA, Iovoaica-Rămescu C, Ciobanu ȘG, Berbecaru EIA, Vochin A, Băluță ID, Istrate-Ofițeru AM, Comănescu CM, Nagy RD, Iliescu DG. Artificial Intelligence in Obstetric Anomaly Scan: Heart and Brain. Life (Basel) 2024;14:166. PMID: 38398675. PMCID: PMC10890185. DOI: 10.3390/life14020166.
Abstract
BACKGROUND The ultrasound scan represents the first tool that obstetricians use in fetal evaluation, but it can be limited by fetal mobility or position, excessive thickness of the maternal abdominal wall, or the presence of post-surgical scars on the maternal abdominal wall. Artificial intelligence (AI) has already been used effectively to measure biometric parameters, automatically recognize standard planes of fetal ultrasound evaluation, and diagnose disease, complementing conventional imaging methods. Combining clinical information, ultrasound scan images, and a machine learning program creates an algorithm capable of assisting healthcare providers by reducing workload, shortening examinations, and increasing diagnostic accuracy. The recent remarkable expansion in the use of electronic medical records and diagnostic imaging coincides with the enormous success of machine learning algorithms in image identification tasks. OBJECTIVES We aim to review the most relevant deep learning studies on ultrasound anomaly scan evaluation of the most complex fetal systems (heart and brain), which enclose the most frequent anomalies.
Affiliation(s)
- Iuliana-Alina Enache
- Doctoral School, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Cătălina Iovoaica-Rămescu
- Doctoral School, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Ștefan Gabriel Ciobanu
- Doctoral School, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Elena Iuliana Anamaria Berbecaru
- Doctoral School, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Andreea Vochin
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Ionuț Daniel Băluță
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Anca Maria Istrate-Ofițeru
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Ginecho Clinic, Medgin SRL, 200333 Craiova, Romania
- Research Centre for Microscopic Morphology and Immunology, University of Medicine and Pharmacy of Craiova, 200642 Craiova, Romania
- Cristina Maria Comănescu
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Ginecho Clinic, Medgin SRL, 200333 Craiova, Romania
- Department of Anatomy, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania
- Rodica Daniela Nagy
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Ginecho Clinic, Medgin SRL, 200333 Craiova, Romania
- Dominic Gabriel Iliescu
- Department of Obstetrics and Gynecology, University Emergency County Hospital, 200642 Craiova, Romania
- Ginecho Clinic, Medgin SRL, 200333 Craiova, Romania
- Department of Obstetrics and Gynecology, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania
4. Rauf F, Khan MA, Bashir AK, Jabeen K, Hamza A, Alzahrani AI, Alalwan N, Masood A. Automated deep bottleneck residual 82-layered architecture with Bayesian optimization for the classification of brain and common maternal fetal ultrasound planes. Front Med (Lausanne) 2023;10:1330218. PMID: 38188327. PMCID: PMC10769562. DOI: 10.3389/fmed.2023.1330218.
Abstract
Despite a worldwide decline in maternal mortality over the past two decades, a significant gap persists between low- and high-income countries, with 94% of maternal mortality concentrated in low- and middle-income nations. Ultrasound serves as a prevalent diagnostic tool in prenatal care for monitoring fetal growth and development. Nevertheless, acquiring standard fetal ultrasound planes with accurate anatomical structures proves challenging and time-intensive, even for skilled sonographers. An automated computer-aided diagnostic (CAD) system is therefore required to classify common maternal-fetal ultrasound planes. A new deep learning architecture based on a residual bottleneck mechanism, 82 layers deep, is proposed. The architecture adds three residual blocks, each including two highway paths and one skip connection, with a 3 × 3 convolutional layer before each residual block. During training, several hyperparameters are initialized using Bayesian optimization (BO) rather than manually. Deep features are extracted from the average pooling layer and used for classification. Because classification over the full feature set increased computational time, an improved search-based moth flame optimization algorithm is proposed for optimal feature selection; the data are then classified using neural network classifiers on the selected features. The experimental phase analyzed ultrasound images of the fetal brain and common maternal-fetal planes. The proposed method achieved 78.5% and 79.4% accuracy for fetal brain planes and common maternal-fetal planes, respectively. Comparison with several pre-trained neural networks and state-of-the-art (SOTA) optimization algorithms shows improved accuracy.
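The residual blocks described above add a skip connection so that each block learns an update on top of the identity. A toy, framework-free sketch of that wiring follows; the scale-and-shift `transform` is a hypothetical stand-in for the paper's convolutional highway paths:

```python
def residual_block(x, transform):
    """y = transform(x) + x: the skip connection passes the input through
    unchanged and adds the learned transformation on top of it."""
    return [t + s for t, s in zip(transform(x), x)]

# Toy "transform" standing in for the block's convolution + activation paths.
halve_and_shift = lambda v: [0.5 * e - 1.0 for e in v]

x = [2.0, 4.0, 6.0]
y = residual_block(x, halve_and_shift)
print(y)  # each output is (0.5*x - 1) + x = 1.5*x - 1, i.e. [2.0, 5.0, 8.0]
```

The same additive structure is why gradients propagate well through very deep stacks: the identity path contributes a direct gradient route around each block.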
Affiliation(s)
- Fatima Rauf
- Department of Computer Science, HITEC University, Taxila, Pakistan
- Ali Kashif Bashir
- Department of Computing and Mathematics, Manchester Metropolitan University, Manchester, United Kingdom
- Kiran Jabeen
- Department of Computer Science, HITEC University, Taxila, Pakistan
- Ameer Hamza
- Department of Computer Science, HITEC University, Taxila, Pakistan
- Nasser Alalwan
- Computer Science Department, Community College, King Saud University, Riyadh, Saudi Arabia
- Anum Masood
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology, Trondheim, Norway
- Institute of Neurosciences and Medicine (INM), Forschungszentrum Jülich, Jülich, Germany
5. Seman NM, Adem HM, Disasa FA, Simegn GL. Development of birth weight estimation model for Ethiopian population from sonographic evaluation. BMC Pregnancy Childbirth 2023;23:850. PMID: 38082249. PMCID: PMC10714654. DOI: 10.1186/s12884-023-06145-9.
Abstract
BACKGROUND Fetal birth weight (FBW) estimation involves predicting the weight of a fetus prior to delivery. This prediction serves as a crucial input for ensuring effective, accurate, and appropriate obstetric planning, management, and decision-making. Typically, two methods are used to estimate FBW: the clinical method (which involves measuring fundal height and performing abdominal palpation) or sonographic evaluation. The accuracy of the clinical method relies heavily on the experience of the clinician. Sonographic evaluation involves utilizing various mathematical models to estimate FBW, primarily relying on fetal biometry. However, these models often demonstrate estimation errors that exceed acceptable levels, which can result in inadequate labor and delivery management planning. One source of this estimation error is sociodemographic variation between population groups in different countries. Additionally, inter- and intra-observer variability during fetal biometry measurement also contributes to errors in FBW estimation. METHODS In this research, a novel mathematical model was proposed through multiple regression analysis to predict FBW with an accepted level of estimation error. To develop the model, population data consisting of fetal biometry, fetal ultrasound images, obstetric variables, and maternal sociodemographic factors (age, marital status, ethnicity, educational status, occupational status, income, etc.) were collected. Two approaches were used to develop the mathematical model. The first was based on fetal biometry data measured by a physician, and the second used fetal biometry data measured with an image processing algorithm comprising preprocessing, segmentation, feature extraction, and fetal biometry measurement. RESULTS The models developed using the two approaches were tested to assess their performance in estimating FBW, achieving mean percentage errors of 7.53% and 5.89%, respectively. Based on these results, the second model was chosen as the final model. CONCLUSION The findings indicate that the developed model can estimate FBW with an acceptable level of error for the Ethiopian population. Furthermore, this model outperforms existing models for FBW estimation. The proposed approach has the potential to reduce infant and maternal mortality rates by providing accurate fetal birth weight estimates for informed obstetric planning.
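The regression-based estimation and mean-percentage-error metric described above can be illustrated with a toy one-predictor model; all values below are synthetic, not the study's data, predictors, or coefficients:

```python
def fit_linear(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def mape(actual, predicted):
    """Mean absolute percentage error, as a percentage."""
    return 100.0 * sum(abs(a - p) / a
                       for a, p in zip(actual, predicted)) / len(actual)

# Synthetic abdominal circumference (mm) vs. birth weight (g) pairs.
ac = [280.0, 300.0, 320.0, 340.0, 360.0]
bw = [2500.0, 2800.0, 3150.0, 3400.0, 3750.0]

a, b = fit_linear(ac, bw)
pred = [a + b * x for x in ac]
print(f"intercept={a:.1f} slope={b:.2f} MAPE={mape(bw, pred):.2f}%")
```

The study's actual model uses multiple predictors (several biometry measurements plus sociodemographic variables), but the fitting and error-reporting machinery is the multi-variable analogue of this sketch.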
Affiliation(s)
- Nejat Mohammed Seman
- Biomedical Imaging Unit, School of Biomedical Engineering, Jimma Institute of Technology, Jimma University, Jimma, Ethiopia
- Hamdia Murad Adem
- Biomedical Imaging Unit, School of Biomedical Engineering, Jimma Institute of Technology, Jimma University, Jimma, Ethiopia
- Fanta Assefa Disasa
- Department of Obstetrics and Gynecology, Jimma Institute of Health Sciences, Jimma University, Jimma, Ethiopia
- Gizeaddis Lamesgin Simegn
- Biomedical Imaging Unit, School of Biomedical Engineering, Jimma Institute of Technology, Jimma University, Jimma, Ethiopia
6. Płotka S, Grzeszczyk MK, Brawura-Biskupski-Samaha R, Gutaj P, Lipa M, Trzciński T, Išgum I, Sánchez CI, Sitek A. BabyNet++: Fetal birth weight prediction using biometry multimodal data acquired less than 24 hours before delivery. Comput Biol Med 2023;167:107602. PMID: 37925906. DOI: 10.1016/j.compbiomed.2023.107602.
Abstract
Accurate prediction of fetal weight at birth is essential for effective perinatal care, particularly in the context of antenatal management, which involves determining the timing and mode of delivery. The current standard of care involves performing a prenatal ultrasound 24 hours prior to delivery. However, this task presents challenges as it requires acquiring high-quality images, which becomes difficult during advanced pregnancy due to the lack of amniotic fluid. In this paper, we present a novel method that automatically predicts fetal birth weight by using fetal ultrasound video scans and clinical data. Our proposed method is based on a Transformer-based approach that combines a Residual Transformer Module with a Dynamic Affine Feature Map Transform. This method leverages tabular clinical data to evaluate 2D+t spatio-temporal features in fetal ultrasound video scans. Development and evaluation were carried out on a clinical set comprising 582 2D fetal ultrasound videos and clinical records of pregnancies from 194 patients performed less than 24 hours before delivery. Our results show that our method outperforms several state-of-the-art automatic methods and estimates fetal birth weight with an accuracy comparable to human experts. Hence, automatic measurements obtained by our method can reduce the risk of errors inherent in manual measurements. Observer studies suggest that our approach may be used as an aid for less experienced clinicians to predict fetal birth weight before delivery, optimizing perinatal care regardless of the available expertise.
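The Dynamic Affine Feature Map Transform mentioned above conditions image features on clinical (tabular) data through learned scale and shift parameters. A minimal FiLM-style sketch of that conditioning follows; the fixed toy `gamma`/`beta` values stand in for parameters that, in the real model, a small network would predict from the tabular inputs:

```python
def affine_modulate(features, gamma, beta):
    """Per-feature affine conditioning: gamma scales, beta shifts.
    In the real model, gamma and beta are predicted from clinical data,
    letting tabular inputs modulate the image feature maps."""
    return [g * f + b for f, g, b in zip(features, gamma, beta)]

feat = [1.0, 2.0, 3.0]    # toy image-feature vector
gamma = [2.0, 0.5, 1.0]   # toy scales (hypothetical values)
beta = [0.0, 1.0, -1.0]   # toy shifts (hypothetical values)
print(affine_modulate(feat, gamma, beta))  # [2.0, 2.0, 2.0]
```

The appeal of this design is that the clinical record steers the visual representation channel-by-channel without requiring the two modalities to be concatenated into one input.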
Affiliation(s)
- Szymon Płotka
- Sano Centre for Computational Medicine, Cracow, Poland; Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands; Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, Amsterdam, The Netherlands.
- Paweł Gutaj
- Department of Reproduction, Poznan University of Medical Sciences, Poznan, Poland
- Michał Lipa
- First Department of Obstetrics and Gynecology, Medical University of Warsaw, Warsaw, Poland
- Tomasz Trzciński
- Institute of Computer Science, Warsaw University of Technology, Warsaw, Poland
- Ivana Išgum
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands; Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, Amsterdam, The Netherlands; Department of Radiology and Nuclear Medicine, Amsterdam University Medical Center, location University of Amsterdam, Amsterdam, The Netherlands
- Clara I Sánchez
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands; Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, Amsterdam, The Netherlands
- Arkadiusz Sitek
- Center for Advanced Medical Computing and Simulation, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
7. Płotka SS, Grzeszczyk MK, Szenejko PI, Żebrowska K, Szymecka-Samaha NA, Łęgowik T, Lipa MA, Kosińska-Kaczyńska K, Brawura-Biskupski-Samaha R, Išgum I, Sánchez CI, Sitek A. Deep learning for estimation of fetal weight throughout the pregnancy from fetal abdominal ultrasound. Am J Obstet Gynecol MFM 2023;5:101182. PMID: 37821009. DOI: 10.1016/j.ajogmf.2023.101182.
Abstract
BACKGROUND Fetal weight is currently estimated from fetal biometry parameters using heuristic mathematical formulas. Fetal biometry requires measurements of the fetal head, abdomen, and femur. However, this examination is prone to inter- and intraobserver variability because of factors such as the experience of the operator, image quality, maternal characteristics, and fetal movements. Our study tested the hypothesis that a deep learning method can estimate fetal weight based on a video scan of the fetal abdomen and gestational age with performance similar to the full biometry-based estimations provided by clinical experts. OBJECTIVE This study aimed to develop and test a deep learning method to automatically estimate fetal weight from fetal abdominal ultrasound video scans. STUDY DESIGN A dataset of 900 routine fetal ultrasound examinations was used. Among those examinations, 800 retrospective ultrasound video scans of the fetal abdomen from 700 pregnant women between 15 6/7 and 41 0/7 weeks of gestation were used to train the deep learning model. After the training phase, the model was evaluated on an external prospectively acquired test set of 100 scans from 100 pregnant women between 16 2/7 and 38 0/7 weeks of gestation. The deep learning model was trained to directly estimate fetal weight from ultrasound video scans of the fetal abdomen. The deep learning estimations were compared with manual measurements on the test set made by 6 human readers with varying levels of expertise. Human readers used the 3 standard measurements made on the standard planes of the head, abdomen, and femur and a heuristic formula to estimate fetal weight. The Bland-Altman analysis, mean absolute percentage error, and intraclass correlation coefficient were used to evaluate the performance and robustness of the deep learning method against the human readers. RESULTS Bland-Altman analysis did not show systematic deviations between readers and deep learning. The mean and standard deviation of the mean absolute percentage error between 6 human readers and the deep learning approach was 3.75%±2.00%. Excluding junior readers (residents), the mean absolute percentage error between 4 experts and the deep learning approach was 2.59%±1.11%. The intraclass correlation coefficients reflected excellent reliability and varied between 0.9761 and 0.9865. CONCLUSION This study reports the use of deep learning to estimate fetal weight using only ultrasound video of the fetal abdomen from fetal biometry scans. Our experiments demonstrated similar performance of human measurements and deep learning on prospectively acquired test data. Deep learning is a promising approach to directly estimate fetal weight using ultrasound video scans of the fetal abdomen.
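The Bland-Altman analysis used above summarizes method agreement as a mean bias and 95% limits of agreement (bias ± 1.96 × SD of the paired differences). A minimal sketch with synthetic paired estimates, not the study's measurements:

```python
import statistics

def bland_altman(x, y):
    """Return (bias, lower LoA, upper LoA) for paired measurements x, y,
    where the 95% limits of agreement are bias ± 1.96 * SD of differences."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)   # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Synthetic paired birth-weight estimates (g): human reader vs. model.
reader = [3200.0, 2950.0, 3400.0, 3100.0, 3600.0]
model = [3180.0, 3000.0, 3380.0, 3150.0, 3590.0]

bias, lo, hi = bland_altman(reader, model)
print(f"bias={bias:.1f} g, 95% LoA=({lo:.1f}, {hi:.1f}) g")
```

"No systematic deviation" in the abstract corresponds to a bias near zero, with the spread of differences captured by the width of the limits of agreement.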
Affiliation(s)
- Szymon S Płotka
- Sano Centre for Computational Medicine, Cracow, Poland; Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands; Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, University of Amsterdam, The Netherlands
- Michal K Grzeszczyk
- Sano Centre for Computational Medicine, Cracow, Poland
- Paula I Szenejko
- First Department of Obstetrics and Gynecology, Medical University of Warsaw, Warsaw, Poland; Doctoral School of Translational Medicine, Centre of Postgraduate Medical Education, Warsaw, Poland
- Kinga Żebrowska
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland
- Natalia A Szymecka-Samaha
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland
- Michał A Lipa
- First Department of Obstetrics and Gynecology, Medical University of Warsaw, Warsaw, Poland
- Katarzyna Kosińska-Kaczyńska
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland
- Robert Brawura-Biskupski-Samaha
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland
- Ivana Išgum
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands; Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, University of Amsterdam, The Netherlands; Department of Radiology and Nuclear Medicine, Amsterdam University Medical Center, University of Amsterdam, The Netherlands
- Clara I Sánchez
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands; Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, University of Amsterdam, The Netherlands
- Arkadiusz Sitek
- Center for Advanced Medical Computing and Simulation, Massachusetts General Hospital, Harvard Medical School, Boston, MA
8. Slimani S, Hounka S, Mahmoudi A, Rehah T, Laoudiyi D, Saadi H, Bouziyane A, Lamrissi A, Jalal M, Bouhya S, Akiki M, Bouyakhf Y, Badaoui B, Radgui A, Mhlanga M, Bouyakhf EH. Fetal biometry and amniotic fluid volume assessment end-to-end automation using Deep Learning. Nat Commun 2023;14:7047. PMID: 37923713. PMCID: PMC10624828. DOI: 10.1038/s41467-023-42438-5.
Abstract
Fetal biometry and amniotic fluid volume assessments are two essential yet repetitive tasks in fetal ultrasound screening scans, aiding in the detection of potentially life-threatening conditions. However, these assessment methods can occasionally yield unreliable results. Advances in deep learning have opened up new avenues for automated measurements in fetal ultrasound, demonstrating human-level performance in various fetal ultrasound tasks. Nevertheless, the majority of these studies are retrospective in silico studies, with a limited number including African patients in their datasets. In this study we developed and prospectively assessed the performance of deep learning models for end-to-end automation of fetal biometry and amniotic fluid volume measurements. These models were trained using a newly constructed database of 172,293 de-identified Moroccan fetal ultrasound images, supplemented with publicly available datasets. The models were then tested on prospectively acquired video clips from a consecutive series of 172 pregnant people gathered at four healthcare centers in Morocco. Our results demonstrate that the 95% limits of agreement between the models and practitioners for the studied measurements were narrower than the reported intra- and inter-observer variability among expert human sonographers for all the parameters under study. This means that these models could be deployed in clinical conditions to alleviate time-consuming, repetitive tasks and make fetal ultrasound more accessible in limited-resource environments.
Affiliation(s)
- Saad Slimani
- Deepecho, 10106, Rabat, Morocco.
- Ibn Rochd University Hospital, Hassan II University, 20100, Casablanca, Morocco.
- Salaheddine Hounka
- Telecommunications Systems Services and Networks lab (STRS Lab), INPT, 10112, Rabat, Morocco
- Abdelhak Mahmoudi
- Deepecho, 10106, Rabat, Morocco
- Ecole Normale Supérieure, LIMIARF, Mohammed V University in Rabat, 4014, Rabat, Morocco
- Dalal Laoudiyi
- Ibn Rochd University Hospital, Hassan II University, 20100, Casablanca, Morocco
- Hanane Saadi
- Mohammed VI University Hospital, 60049, Oujda, Morocco
- Amal Bouziyane
- Université Mohammed VI des Sciences de la Santé, Hôpital Universitaire Cheikh Khalifa, 82403, Casablanca, Morocco
- Amine Lamrissi
- Ibn Rochd University Hospital, Hassan II University, 20100, Casablanca, Morocco
- Mohamed Jalal
- Ibn Rochd University Hospital, Hassan II University, 20100, Casablanca, Morocco
- Said Bouhya
- Ibn Rochd University Hospital, Hassan II University, 20100, Casablanca, Morocco
- Bouabid Badaoui
- Laboratory of Biodiversity, Ecology, and Genome, Department of Biology, Faculty of Sciences, Mohammed V University in Rabat, 1014, Rabat, Morocco
- African Sustainable Agriculture Research Institute (ASARI), Mohammed VI Polytechnic University (UM6P), 43150, Laâyoune, Morocco
- Amina Radgui
- Telecommunications Systems Services and Networks lab (STRS Lab), INPT, 10112, Rabat, Morocco
- Musa Mhlanga
- Radboud Institute for Molecular Life Sciences, Epigenomics & Single Cell Biophysics, 6525 XZ, Nijmegen, the Netherlands
9. Jost E, Kosian P, Jimenez Cruz J, Albarqouni S, Gembruch U, Strizek B, Recker F. Evolving the Era of 5D Ultrasound? A Systematic Literature Review on the Applications for Artificial Intelligence Ultrasound Imaging in Obstetrics and Gynecology. J Clin Med 2023;12:6833. PMID: 37959298. PMCID: PMC10649694. DOI: 10.3390/jcm12216833.
Abstract
Artificial intelligence (AI) has gained prominence in medical imaging, particularly in obstetrics and gynecology (OB/GYN), where ultrasound (US) is the preferred method. It is considered cost-effective and easily accessible but is time-consuming and hindered by the need for specialized training. To overcome these limitations, AI models have been proposed for automated plane acquisition, anatomical measurements, and pathology detection. This study aims to provide an overview of recent literature on AI applications in OB/GYN US imaging, highlighting their benefits and limitations. For the methodology, a systematic literature search was performed in the PubMed and Cochrane Library databases. Matching abstracts were screened based on the PICOS (Participants, Intervention or Exposure, Comparison, Outcome, Study type) scheme. Articles with available full texts were assigned to the OB/GYN sections and their research topics. As a result, this review includes 189 articles published from 1994 to 2023. Among these, 148 focus on obstetrics and 41 on gynecology. AI-assisted US applications span fetal biometry, echocardiography, and neurosonography, as well as the identification of adnexal and breast masses and assessment of the endometrium and pelvic floor. To conclude, the applications for AI-assisted US in OB/GYN are abundant, especially in the subspecialty of obstetrics. However, while most studies focus on common application fields such as fetal biometry, this review outlines emerging and still experimental fields to promote further research.
Affiliation(s)
- Elena Jost
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Philipp Kosian
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Jorge Jimenez Cruz
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Shadi Albarqouni
- Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Helmholtz AI, Helmholtz Munich, Ingolstädter Landstraße 1, 85764 Neuherberg, Germany
- Ulrich Gembruch
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Brigitte Strizek
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Florian Recker
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
10
Gao Z, Tian Z, Pu B, Li S, Li K. Deep endpoints focusing network under geometric constraints for end-to-end biometric measurement in fetal ultrasound images. Comput Biol Med 2023; 165:107399. [PMID: 37683530] [DOI: 10.1016/j.compbiomed.2023.107399]
Abstract
Biometric measurement in fetal ultrasound images is one of the most demanding medical image analysis tasks and can directly contribute to diagnosing fetal diseases. However, the inherent high speckle noise and shadows in ultrasound data pose significant challenges for automatic biometric measurement. Almost all existing dominant automatic methods are two-stage models, in which the key anatomical structures are first segmented and then measured, thereby introducing both segmentation and fitting errors. Worse, the second-stage fitting depends entirely on the quality of the first-stage segmentation, so any segmentation error leads to a larger fitting error. To this end, we propose a novel end-to-end biometric measurement network, abbreviated as E2EBM-Net, that directly fits the measurement parameters. E2EBM-Net includes a cross-level feature fusion module to extract multi-scale texture information, a hard-soft attention module to improve position sensitivity, and center-focused detectors that jointly achieve accurate localization and regression of the measurement endpoints, as well as a loss function with geometric cues to enhance their correlations. To our knowledge, this is the first AI-based application to address the biometric measurement of irregular anatomical structures in fetal ultrasound images with an end-to-end approach. Experimental results showed that E2EBM-Net outperformed existing methods and achieved state-of-the-art performance.
Affiliation(s)
- Zhan Gao
- College of Computer Science and Electronic Engineering, Hunan University, Changsha 410000, China
- Zean Tian
- College of Computer Science and Electronic Engineering, Hunan University, Changsha 410000, China
- Bin Pu
- College of Computer Science and Electronic Engineering, Hunan University, Changsha 410000, China
- Shengli Li
- Department of Ultrasound, Shenzhen Maternal & Child Healthcare Hospital, Southern Medical University, Shenzhen, 518028, China
- Kenli Li
- College of Computer Science and Electronic Engineering, Hunan University, Changsha 410000, China
11
Pietrolucci ME, Maqina P, Mappa I, Marra MC, D'Antonio F, Rizzo G. Evaluation of an artificial intelligent algorithm (Heartassist™) to automatically assess the quality of second trimester cardiac views: a prospective study. J Perinat Med 2023; 51:920-924. [PMID: 37097825] [DOI: 10.1515/jpm-2023-0052]
Abstract
OBJECTIVES The aim of this study was to evaluate the agreement between visual and automatic methods in assessing the adequacy of fetal cardiac views obtained during second-trimester ultrasonographic examination. METHODS In a prospective observational study, frames of the four-chamber view, left and right outflow tracts, and three-vessel trachea view were obtained from 120 consecutive singleton low-risk women undergoing second-trimester ultrasound at 19-23 weeks of gestation. For each frame, quality assessment was performed by an expert sonographer and by an artificial intelligence software (Heartassist™). Cohen's κ coefficient was used to evaluate the agreement between the two techniques. RESULTS The number and percentage of images considered adequate by expert visual assessment or by Heartassist™ were similar, exceeding 87 % for all the cardiac views considered. Cohen's κ values were 0.827 (95 % CI 0.662-0.992) for the four-chamber view, 0.814 (95 % CI 0.638-0.990) for the left ventricular outflow tract, 0.838 (95 % CI 0.683-0.992) for the right ventricular outflow tract, and 0.866 (95 % CI 0.717-0.999) for the three-vessel trachea view, indicating good agreement between the two techniques. CONCLUSIONS Heartassist™ allows automatic evaluation of fetal cardiac views, achieves the same accuracy as expert visual assessment, and has the potential to be applied in the evaluation of the fetal heart during second-trimester ultrasonographic screening for fetal anomalies.
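The agreement statistic used in this study, Cohen's κ, can be computed directly from two raters' label sequences. A minimal sketch; the function and the adequacy labels below are illustrative, not the study's data or software:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of categorical labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters label identically
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the raters labeled independently of each other
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[lab] / n) * (counts_b[lab] / n)
              for lab in set(counts_a) | set(counts_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical adequacy ratings for 10 frames ("A" = adequate, "I" = inadequate)
expert   = ["A", "A", "A", "A", "I", "I", "A", "I", "A", "A"]
software = ["A", "A", "I", "A", "I", "I", "A", "I", "A", "A"]
print(round(cohens_kappa(expert, software), 3))  # → 0.783
```

Values around 0.8, as reported for all four cardiac views, are conventionally read as good to substantial agreement.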
Affiliation(s)
- Maria Elena Pietrolucci
- Department of Obstetrics and Gynecology, Fondazione Policlinico Tor Vergata, Università di Roma Tor Vergata, Roma, Italy
- Pavjola Maqina
- Department of Obstetrics and Gynecology, Fondazione Policlinico Tor Vergata, Università di Roma Tor Vergata, Roma, Italy
- Ilenia Mappa
- Department of Obstetrics and Gynecology, Fondazione Policlinico Tor Vergata, Università di Roma Tor Vergata, Roma, Italy
- Maria Chiara Marra
- Department of Obstetrics and Gynecology, Fondazione Policlinico Tor Vergata, Università di Roma Tor Vergata, Roma, Italy
- Giuseppe Rizzo
- Department of Obstetrics and Gynecology, Fondazione Policlinico Tor Vergata, Università di Roma Tor Vergata, Roma, Italy
12
Ramirez Zegarra R, Ghi T. Use of artificial intelligence and deep learning in fetal ultrasound imaging. Ultrasound Obstet Gynecol 2023; 62:185-194. [PMID: 36436205] [DOI: 10.1002/uog.26130]
Abstract
Deep learning is considered the leading artificial intelligence tool in image analysis in general. Deep-learning algorithms excel at image recognition, which makes them valuable in medical imaging. Obstetric ultrasound has become the gold standard imaging modality for detection and diagnosis of fetal malformations. However, ultrasound relies heavily on the operator's experience, making it unreliable in inexperienced hands. Several studies have proposed the use of deep-learning models as a tool to support sonographers, in an attempt to overcome these problems inherent to ultrasound. Deep learning has many clinical applications in the field of fetal imaging, including identification of normal and abnormal fetal anatomy and measurement of fetal biometry. In this Review, we provide a comprehensive explanation of the fundamentals of deep learning in fetal imaging, with particular focus on its clinical applicability. © 2022 International Society of Ultrasound in Obstetrics and Gynecology.
Affiliation(s)
- R Ramirez Zegarra
- Department of Medicine and Surgery, Obstetrics and Gynecology Unit, University of Parma, Parma, Italy
- T Ghi
- Department of Medicine and Surgery, Obstetrics and Gynecology Unit, University of Parma, Parma, Italy
13
Lee C, Willis A, Chen C, Sieniek M, Watters A, Stetson B, Uddin A, Wong J, Pilgrim R, Chou K, Tse D, Shetty S, Gomes RG. Development of a Machine Learning Model for Sonographic Assessment of Gestational Age. JAMA Netw Open 2023; 6:e2248685. [PMID: 36598790] [PMCID: PMC9857195] [DOI: 10.1001/jamanetworkopen.2022.48685]
Abstract
IMPORTANCE Fetal ultrasonography is essential for confirmation of gestational age (GA), and accurate GA assessment is important for providing appropriate care throughout pregnancy and for identifying complications, including fetal growth disorders. Derivation of GA from manual fetal biometry measurements (ie, head, abdomen, and femur) is operator dependent and time-consuming. OBJECTIVE To develop artificial intelligence (AI) models to estimate GA with higher accuracy and reliability, leveraging standard biometry images and fly-to ultrasonography videos. DESIGN, SETTING, AND PARTICIPANTS To improve GA estimates, this diagnostic study used AI to interpret standard plane ultrasonography images and fly-to ultrasonography videos, which are 5- to 10-second videos that can be automatically recorded as part of the standard of care before the still image is captured. Three AI models were developed and validated: (1) an image model using standard plane images, (2) a video model using fly-to videos, and (3) an ensemble model (combining both image and video models). The models were trained and evaluated on data from the Fetal Age Machine Learning Initiative (FAMLI) cohort, which included participants from 2 study sites at Chapel Hill, North Carolina (US), and Lusaka, Zambia. Participants were eligible to be part of this study if they received routine antenatal care at 1 of these sites, were aged 18 years or older, had a viable intrauterine singleton pregnancy, and could provide written consent. They were not eligible if they had known uterine or fetal abnormality, or had any other conditions that would make participation unsafe or complicate interpretation. Data analysis was performed from January to July 2022. MAIN OUTCOMES AND MEASURES The primary analysis outcome for GA was the mean difference in absolute error between the GA model estimate and the clinical standard estimate, with the ground truth GA extrapolated from the initial GA estimated at an initial examination. 
RESULTS Of the total cohort of 3842 participants, data were calculated for a test set of 404 participants with a mean (SD) age of 28.8 (5.6) years at enrollment. All models were statistically superior to standard fetal biometry-based GA estimates derived from images captured by expert sonographers. The ensemble model had the lowest mean absolute error compared with the clinical standard fetal biometry (mean [SD] difference, -1.51 [3.96] days; 95% CI, -1.90 to -1.10 days). All 3 models outperformed standard biometry by a more substantial margin on fetuses that were predicted to be small for their GA. CONCLUSIONS AND RELEVANCE These findings suggest that AI models have the potential to empower trained operators to estimate GA with higher accuracy.
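The primary outcome above compares the mean absolute error (MAE) of a model's gestational-age estimate against that of the clinical biometry-based estimate, relative to a ground-truth GA. A minimal sketch of that comparison; the day values below are made up for illustration and are not FAMLI data:

```python
def mean_abs_error(estimates, truth):
    """Mean absolute error, in the same unit as the inputs (here: days)."""
    assert len(estimates) == len(truth) and truth
    return sum(abs(e - t) for e, t in zip(estimates, truth)) / len(truth)

# Hypothetical gestational ages (days) for five pregnancies
truth_ga    = [200, 210, 180, 250, 230]   # ground-truth GA
clinical_ga = [204, 205, 186, 244, 237]   # standard biometry estimate
model_ga    = [202, 208, 182, 248, 232]   # AI model estimate

mae_clinical = mean_abs_error(clinical_ga, truth_ga)  # 5.6 days
mae_model    = mean_abs_error(model_ga, truth_ga)     # 2.0 days

# A negative (model - clinical) MAE difference, as reported above, favors the model
print(mae_model - mae_clinical)
```

The study reports this difference as -1.51 days (95% CI, -1.90 to -1.10) for the ensemble model, i.e., the model's estimates were on average 1.51 days closer to the ground truth than the clinical standard.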
Affiliation(s)
- Chace Lee
- Google Health, Palo Alto, California
- Amber Watters
- Department of Obstetrics and Gynecology, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Bethany Stetson
- Department of Obstetrics and Gynecology, Northwestern University Feinberg School of Medicine, Chicago, Illinois