1. Zhang Z, Zhou X, Fang Y, Xiong Z, Zhang T. AI-driven 3D bioprinting for regenerative medicine: From bench to bedside. Bioact Mater 2025;45:201-230. PMID: 39651398; PMCID: PMC11625302; DOI: 10.1016/j.bioactmat.2024.11.021. Received 09/23/2024; revised 11/01/2024; accepted 11/16/2024. Open access.
Abstract
In recent decades, 3D bioprinting has attracted significant research attention for its ability to manipulate biomaterials and cells to create complex structures with precision. However, technological and cost constraints have hindered the clinical translation of 3D bioprinted products (BPPs) from bench to bedside, particularly with respect to personalized design and scaled-up production. Emerging applications of artificial intelligence (AI) have recently improved the performance of 3D bioprinting significantly, yet the existing literature lacks a methodological exploration of how AI technologies can overcome these challenges and advance 3D bioprinting toward clinical application. This paper presents a systematic methodology for AI-driven 3D bioprinting, structured within the theoretical framework of Quality by Design (QbD). It first introduces QbD theory into 3D bioprinting and then summarizes the technology roadmap for AI integration, covering multi-scale and multi-modal sensing, data-driven design, and in-line process control. It further describes specific AI applications in the key elements of 3D bioprinting, including bioink formulation, model structure, printing process, and function regulation. Finally, the paper discusses current prospects and challenges for AI technologies in further advancing the clinical translation of 3D bioprinting.
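To make the "in-line process control" element concrete, the sketch below shows a deliberately simplified feedback loop in which an in-line measurement of filament width drives a proportional correction of extrusion pressure. All names, gains, and the linear process response are hypothetical illustrations of the concept, not the methodology or data from the paper.

```python
# Illustrative sketch only: a minimal in-line process-control loop of the kind the
# review groups under multi-modal sensing and in-line process control. All names,
# gains, and the linear sensitivity model are hypothetical, not taken from the paper.
from dataclasses import dataclass

@dataclass
class PrintState:
    pressure_kpa: float          # extrusion pressure actuated by the controller
    target_width_um: float       # desired filament width from the digital design

def measured_width_um(pressure_kpa: float) -> float:
    """Stand-in for an in-line camera/OCT measurement of the deposited filament."""
    return 180.0 + 2.5 * (pressure_kpa - 80.0)   # hypothetical process response

def control_step(state: PrintState, gain: float = 0.2) -> PrintState:
    """One proportional correction: compare sensed width to target, adjust pressure."""
    error = state.target_width_um - measured_width_um(state.pressure_kpa)
    return PrintState(state.pressure_kpa + gain * error, state.target_width_um)

state = PrintState(pressure_kpa=95.0, target_width_um=200.0)
for _ in range(10):                      # iterate until the filament width converges
    state = control_step(state)
print(round(measured_width_um(state.pressure_kpa), 1))  # approaches the 200 µm target
```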
Affiliation(s)
- Zhenrui Zhang
- Biomanufacturing Center, Department of Mechanical Engineering, Tsinghua University, Beijing, 100084, PR China
- Biomanufacturing and Rapid Forming Technology Key Laboratory of Beijing, Beijing, 100084, PR China
- “Biomanufacturing and Engineering Living Systems” Innovation International Talents Base (111 Base), Beijing, 100084, PR China
- Xianhao Zhou
- Biomanufacturing Center, Department of Mechanical Engineering, Tsinghua University, Beijing, 100084, PR China
- Biomanufacturing and Rapid Forming Technology Key Laboratory of Beijing, Beijing, 100084, PR China
- “Biomanufacturing and Engineering Living Systems” Innovation International Talents Base (111 Base), Beijing, 100084, PR China
- Yongcong Fang
- Biomanufacturing Center, Department of Mechanical Engineering, Tsinghua University, Beijing, 100084, PR China
- Biomanufacturing and Rapid Forming Technology Key Laboratory of Beijing, Beijing, 100084, PR China
- “Biomanufacturing and Engineering Living Systems” Innovation International Talents Base (111 Base), Beijing, 100084, PR China
- State Key Laboratory of Tribology in Advanced Equipment, Tsinghua University, Beijing, 100084, PR China
- Zhuo Xiong
- Biomanufacturing Center, Department of Mechanical Engineering, Tsinghua University, Beijing, 100084, PR China
- Biomanufacturing and Rapid Forming Technology Key Laboratory of Beijing, Beijing, 100084, PR China
- “Biomanufacturing and Engineering Living Systems” Innovation International Talents Base (111 Base), Beijing, 100084, PR China
- Ting Zhang
- Biomanufacturing Center, Department of Mechanical Engineering, Tsinghua University, Beijing, 100084, PR China
- Biomanufacturing and Rapid Forming Technology Key Laboratory of Beijing, Beijing, 100084, PR China
- “Biomanufacturing and Engineering Living Systems” Innovation International Talents Base (111 Base), Beijing, 100084, PR China
- State Key Laboratory of Tribology in Advanced Equipment, Tsinghua University, Beijing, 100084, PR China
2. Rabe M, Kurz C, Thummerer A, Landry G. Artificial intelligence for treatment delivery: image-guided radiotherapy. Strahlenther Onkol 2025;201:283-297. PMID: 39138806; DOI: 10.1007/s00066-024-02277-9. Received 03/01/2024; accepted 07/07/2024.
Abstract
Radiation therapy (RT) is a highly digitized field relying heavily on computational methods and, as such, has a high affinity for the automation potential afforded by modern artificial intelligence (AI). This is particularly relevant where imaging is concerned, especially during image-guided RT (IGRT). With the advent of online adaptive RT (ART) workflows at magnetic resonance (MR) linear accelerators (linacs) and at cone-beam computed tomography (CBCT) linacs, the need for automation is further increased. AI as applied to modern IGRT is thus one area of RT where we can expect important developments in the near future. In this review article, after outlining modern IGRT and online ART workflows, we cover the role of AI in CBCT and MRI correction for dose calculation, auto-segmentation on IGRT imaging, motion management, and response assessment based on in-room imaging.
Affiliation(s)
- Moritz Rabe
- Department of Radiation Oncology, LMU University Hospital, LMU Munich, Marchioninistraße 15, 81377, Munich, Bavaria, Germany
- Christopher Kurz
- Department of Radiation Oncology, LMU University Hospital, LMU Munich, Marchioninistraße 15, 81377, Munich, Bavaria, Germany
- Adrian Thummerer
- Department of Radiation Oncology, LMU University Hospital, LMU Munich, Marchioninistraße 15, 81377, Munich, Bavaria, Germany
- Guillaume Landry
- Department of Radiation Oncology, LMU University Hospital, LMU Munich, Marchioninistraße 15, 81377, Munich, Bavaria, Germany.
- German Cancer Consortium (DKTK), partner site Munich, a partnership between the DKFZ and the LMU University Hospital Munich, Marchioninistraße 15, 81377, Munich, Bavaria, Germany.
- Bavarian Cancer Research Center (BZKF), Marchioninistraße 15, 81377, Munich, Bavaria, Germany.
3. Grube S, Latus S, Behrendt F, Riabova O, Neidhardt M, Schlaefer A. Needle tracking in low-resolution ultrasound volumes using deep learning. Int J Comput Assist Radiol Surg 2024;19:1975-1981. PMID: 39002100; PMCID: PMC11442564; DOI: 10.1007/s11548-024-03234-8. Received 01/12/2024; accepted 07/03/2024.
Abstract
PURPOSE: Clinical needle insertion into tissue, commonly assisted by 2D ultrasound imaging for real-time navigation, faces the challenge of precisely aligning needle and probe to reduce out-of-plane movement. Recent studies investigate 3D ultrasound imaging together with deep learning to overcome this problem, focusing on acquiring high-resolution images to create optimal conditions for needle tip detection. However, high-resolution imaging also requires considerable time for acquisition and processing, which limits real-time capability. We therefore aim to maximize the ultrasound (US) volume rate, accepting low image resolution as the trade-off, and propose a deep learning approach to directly extract the 3D needle tip position from sparsely sampled US volumes. METHODS: We design an experimental setup in which a robot inserts a needle into water and chicken liver tissue. Instead of manual annotation, we derive the needle tip position from the known robot pose. During insertion, we acquire a large dataset of low-resolution volumes using a 16 × 16 element matrix transducer at a volume rate of 4 Hz. We compare the performance of our deep learning approach with conventional needle segmentation. RESULTS: Our experiments in water and liver show that deep learning outperforms the conventional approach while achieving sub-millimeter accuracy. The deep learning approach yields mean position errors of 0.54 mm in water and 1.54 mm in liver. CONCLUSION: Our study underlines the strength of deep learning in predicting 3D needle positions from low-resolution ultrasound volumes. This is an important milestone for real-time needle navigation, simplifying the alignment of needle and ultrasound probe and enabling 3D motion analysis.
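The abstract describes regressing the 3D needle tip position directly from a sparsely sampled ultrasound volume. The sketch below is a minimal PyTorch illustration of that general idea (a small 3D CNN with a three-value regression head); the architecture, volume size, and loss are assumptions for illustration, not the network reported in the paper.

```python
# Minimal sketch of the general idea: a 3D CNN regressing the needle-tip coordinates
# directly from a low-resolution ultrasound volume. Architecture, volume size, and
# training details are illustrative assumptions, not the network from the paper.
import torch
import torch.nn as nn

class NeedleTipRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),               # collapse spatial dimensions
        )
        self.head = nn.Linear(64, 3)               # regress (x, y, z) in mm

    def forward(self, volume: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(volume).flatten(1))

model = NeedleTipRegressor()
# Hypothetical low-resolution input, e.g. 32 x 32 x 64 voxels from a matrix probe.
volume = torch.randn(8, 1, 32, 32, 64)              # batch of 8 single-channel volumes
target = torch.randn(8, 3)                          # tip positions from the robot pose
loss = nn.SmoothL1Loss()(model(volume), target)     # robust regression loss
loss.backward()
```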
Affiliation(s)
- Sarah Grube
- Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany.
- Sarah Latus
- Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Finn Behrendt
- Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Oleksandra Riabova
- Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Maximilian Neidhardt
- Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
- Alexander Schlaefer
- Institute of Medical Technology and Intelligent Systems, Hamburg University of Technology, Hamburg, Germany
4. Smahi A, Lakhal O, Chettibi T, Sanz Lopez M, Pasquier D, Merzouki R. Adaptive approach for tracking movements of biological targets: application to robot-based intervention for prostate cancer. Front Robot AI 2024;11:1416662. PMID: 39188571; PMCID: PMC11345532; DOI: 10.3389/frobt.2024.1416662. Received 04/12/2024; accepted 07/16/2024. Open access.
Abstract
Introduction: In this paper, we introduce an advanced robotic system integrated with an adaptive optimization algorithm, tailored to brachytherapy for prostate cancer treatment. The primary innovation of the system is the algorithm itself, designed to dynamically adjust needle trajectories in response to the real-time movements of the prostate gland during the local intervention. Methods: The system employs real-time position data extracted from magnetic resonance imaging (MRI) to ensure precise targeting of the prostate, adapting to its constant motion and deformation. This precision is crucial in brachytherapy, where the accurate placement of radioactive seeds directly determines the efficacy of the treatment and minimizes damage to surrounding healthy tissue. Results: Our results demonstrate a marked improvement in the accuracy of seed placement, translating directly into more effective radiation delivery. The adaptive nature of the algorithm significantly reduces the number of needle insertions, making treatment less invasive for patients and thereby lowering the risk of infection and shortening recovery times. Discussion: Enhanced by the adaptive optimization algorithm, this novel robotic system improves target coverage by approximately 15% over a traditional combinatorial approach while requiring fewer needles. The improved precision and reduced invasiveness highlight the system's potential to enhance the overall effectiveness and patient experience of prostate cancer brachytherapy.
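As a rough illustration of the two ingredients the abstract combines (updating targets with real-time, MRI-derived prostate motion, and choosing needle trajectories that keep coverage high with few insertions), the sketch below shifts planned seed targets by an observed displacement and then greedily selects candidate needle axes. The geometry, coverage radius, and greedy strategy are hypothetical stand-ins, not the authors' optimization algorithm.

```python
# Illustrative sketch only: (a) update seed targets with a real-time, MRI-derived
# prostate displacement and (b) greedily pick needle trajectories covering the
# updated targets. All values and the greedy strategy are assumptions.
import numpy as np

def updated_targets(planned: np.ndarray, displacement: np.ndarray) -> np.ndarray:
    """Shift planned seed targets (N x 3, mm) by the currently observed rigid motion."""
    return planned + displacement

def greedy_needle_selection(targets: np.ndarray, candidates: np.ndarray,
                            radius_mm: float = 3.0) -> list[int]:
    """Pick candidate needle axes (M x 3 entry points, assumed parallel insertion)
    until every target lies within radius_mm of some chosen axis in the x-y plane."""
    uncovered = set(range(len(targets)))
    chosen: list[int] = []
    while uncovered:
        best, best_cover = None, set()
        for j, c in enumerate(candidates):      # coverage over still-uncovered targets
            cover = {i for i in uncovered
                     if np.linalg.norm(targets[i, :2] - c[:2]) <= radius_mm}
            if len(cover) > len(best_cover):
                best, best_cover = j, cover
        if best is None:                        # remaining targets are unreachable
            break
        chosen.append(best)
        uncovered -= best_cover
    return chosen

planned = np.array([[10.0, 12.0, 40.0], [12.0, 14.0, 42.0], [20.0, 8.0, 38.0]])
motion = np.array([1.5, -0.8, 0.3])             # hypothetical MRI-tracked shift (mm)
grid = np.array([[11.0, 13.0, 0.0], [20.5, 7.5, 0.0], [15.0, 10.0, 0.0]])
print(greedy_needle_selection(updated_targets(planned, motion), grid))  # e.g. [0, 1]
```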
Affiliation(s)
- Abdeslem Smahi
- CRIStAL, CNRS-UMR 9189, University of Lille, Villeneuve d’Ascq, France
- Othman Lakhal
- CRIStAL, CNRS-UMR 9189, University of Lille, Villeneuve d’Ascq, France
- Taha Chettibi
- Department of Mechanical Engineering, Blida-1 University, Blida, Algeria
- Mario Sanz Lopez
- CRIStAL, CNRS-UMR 9189, University of Lille, Villeneuve d’Ascq, France
- David Pasquier
- CRIStAL, CNRS-UMR 9189, University of Lille, Villeneuve d’Ascq, France
- Academic Department of Radiation Oncology, Centre O. Lambret, Lille, France
- Rochdi Merzouki
- CRIStAL, CNRS-UMR 9189, University of Lille, Villeneuve d’Ascq, France
5. Wu Y, Wang Z, Chu Y, Peng R, Peng H, Yang H, Guo K, Zhang J. Current Research Status of Respiratory Motion for Thorax and Abdominal Treatment: A Systematic Review. Biomimetics (Basel) 2024;9:170. PMID: 38534855; DOI: 10.3390/biomimetics9030170. Received 01/22/2024; revised 02/29/2024; accepted 03/09/2024. Open access.
Abstract
Malignant tumors have become a serious public health problem threatening human safety and health, and tumors of the chest and abdomen account for the largest proportion. Early diagnosis and treatment can effectively improve patient survival. However, respiratory motion in the chest and abdomen causes uncertainty in the shape, volume, and location of the tumor, which makes treatment in these regions difficult. Compensation for respiratory motion is therefore very important in clinical treatment. The purpose of this review was to discuss the research and development of respiratory motion monitoring and prediction in thoracic and abdominal surgery and to summarize the current state of research. The integration of modern respiratory motion compensation technology with advanced sensor detection, medical-image-guided therapy, and artificial intelligence is discussed and analyzed. Future research on intraoperative thoracic and abdominal respiratory motion compensation should move toward non-invasive, non-contact, low-dose, and intelligent solutions. The complexity of the surgical environment, the limited accuracy of existing image-guidance devices, and the latency of data transmission remain key technical challenges.
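One recurring theme in this literature, latency compensation, can be illustrated with a small prediction example: forecasting the respiratory signal a fixed horizon ahead so that a treatment system acting with delay still targets the current anatomy. The sketch below uses a synthetic breathing trace and a least-squares autoregressive predictor; the signal, sampling rate, and horizon are hypothetical and are not taken from the review.

```python
# Minimal sketch, under assumed settings: compensate a fixed system latency by
# predicting the respiratory signal a short horizon ahead with a least-squares
# autoregressive model. Signal, sampling rate, and horizon are hypothetical.
import numpy as np

fs, latency_s = 25.0, 0.2                      # 25 Hz monitoring, 200 ms system latency
horizon = int(latency_s * fs)                  # predict 5 samples ahead
t = np.arange(0, 60, 1 / fs)
breath = 10 * np.sin(2 * np.pi * t / 4.0) + 0.3 * np.random.randn(t.size)  # trace in mm

order = 20                                     # autoregressive window length
X = np.stack([breath[i:i + order] for i in range(breath.size - order - horizon)])
y = breath[order + horizon:]                   # value "horizon" samples later
w, *_ = np.linalg.lstsq(X, y, rcond=None)      # fit AR coefficients

pred = X @ w                                   # predicted position one latency ahead
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"prediction RMSE over {latency_s * 1000:.0f} ms horizon: {rmse:.2f} mm")
```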
Affiliation(s)
- Yuwen Wu
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- Zhisen Wang
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China
- Yuyi Chu
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- Renyuan Peng
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China
- Haoran Peng
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China
- Hongbo Yang
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China
- Kai Guo
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China
- Juzhong Zhang
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China