1
Andersson E, Hult J, Troein C, Stridh M, Sjögren B, Pekar-Lukacs A, Hernandez-Palacios J, Edén P, Persson B, Olariu V, Malmsjö M, Merdasa A. Facilitating clinically relevant skin tumor diagnostics with spectroscopy-driven machine learning. iScience 2024; 27:109653. [PMID: 38680659] [PMCID: PMC11053315] [DOI: 10.1016/j.isci.2024.109653] [Received: 11/23/2023] [Revised: 03/26/2024] [Accepted: 04/01/2024] [Indexed: 05/01/2024] Open Access
Abstract
In the dawning era of artificial intelligence (AI), health care stands to undergo a significant transformation as patient data become increasingly digitalized. Digital imaging, in particular, will serve as an important platform for AI-assisted decision making and diagnostics. A growing number of studies demonstrate the potential of automatic pre-surgical skin tumor delineation, which could have a tremendous impact on clinical practice. However, current methods rely on ground truth images in which tumor borders have already been identified, which is not feasible in a clinical setting. We report a novel approach in which hyperspectral images provide spectra from small regions representing healthy tissue and tumor; these spectra are used to generate prediction maps with artificial neural networks (ANNs), after which a segmentation algorithm automatically identifies the tumor borders. This circumvents the need for ground truth images, since an ANN model is trained on data from each individual patient, representing a more clinically relevant approach.
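The per-patient workflow the abstract describes can be sketched roughly as follows: train an ANN on spectra from small annotated healthy/tumor reference regions of one patient's hyperspectral image, then predict a tumor-probability map over every pixel. All data, dimensions, and the network architecture below are synthetic assumptions for illustration; the study's actual preprocessing, ANN design, and border-segmentation algorithm are not reproduced here.

```python
# Hypothetical sketch of spectroscopy-driven, per-patient tumor mapping.
# Synthetic data only; not the paper's actual method.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
H, W, BANDS = 32, 32, 50          # toy hyperspectral cube: 32x32 pixels, 50 bands

# Synthetic cube: the right half ("tumor") has a shifted spectral profile
cube = rng.normal(0.5, 0.05, size=(H, W, BANDS))
cube[:, W // 2:, :] += np.linspace(0.0, 0.3, BANDS)

# Small annotated patches stand in for clinician-marked reference regions
healthy = cube[:5, :5, :].reshape(-1, BANDS)    # healthy reference spectra
tumor = cube[:5, -5:, :].reshape(-1, BANDS)     # tumor reference spectra
X = np.vstack([healthy, tumor])
y = np.array([0] * len(healthy) + [1] * len(tumor))

# Per-patient model: trained only on this patient's reference spectra,
# so no ground-truth tumor-border image is needed
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(X, y)

# Tumor-probability map over the whole image; simple thresholding stands
# in for the paper's segmentation step that delineates the border
prob_map = ann.predict_proba(cube.reshape(-1, BANDS))[:, 1].reshape(H, W)
mask = prob_map > 0.5
print(mask[:, :5].mean(), mask[:, -5:].mean())  # left ~healthy, right ~tumor
```

The key design point the abstract emphasizes is that the training data come from the same image being segmented, so the model adapts to each patient's tissue rather than requiring a pre-annotated training corpus.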
Affiliation(s)
- Emil Andersson
- Centre for Environmental and Climate Science, Lund University, Lund, Sweden
- Jenny Hult
- Department of Clinical Sciences Lund, Ophthalmology, Lund University, Lund, Sweden
- Carl Troein
- Centre for Environmental and Climate Science, Lund University, Lund, Sweden
- Magne Stridh
- Department of Clinical Sciences Lund, Ophthalmology, Lund University, Lund, Sweden
- Benjamin Sjögren
- Department of Clinical Sciences Lund, Ophthalmology, Lund University, Lund, Sweden
- Patrik Edén
- Centre for Environmental and Climate Science, Lund University, Lund, Sweden
- Bertil Persson
- Department of Dermatology, Skåne University Hospital, Lund, Sweden
- Victor Olariu
- Centre for Environmental and Climate Science, Lund University, Lund, Sweden
- Malin Malmsjö
- Department of Clinical Sciences Lund, Ophthalmology, Lund University, Lund, Sweden
- Aboma Merdasa
- Department of Clinical Sciences Lund, Ophthalmology, Lund University, Lund, Sweden
2
Dermoscopic Image Classification of Pigmented Nevus under Deep Learning and the Correlation with Pathological Features. Computational and Mathematical Methods in Medicine 2022; 2022:9726181. [PMID: 35669372] [PMCID: PMC9167096] [DOI: 10.1155/2022/9726181] [Received: 03/04/2022] [Revised: 04/19/2022] [Accepted: 04/23/2022] [Indexed: 11/26/2022]
Abstract
The objective of this study was to explore deep-learning-based classification of dermoscopic images of pigmented nevus (PN) and its correlation with pathological features. A total of 268 patients were included and randomly divided into an observation group (n = 134) and a control group (n = 134). An image recognition algorithm was used for feature extraction, segmentation, and classification of the dermoscopic images, and the performance and accuracy of the fusion classifier were compared. After optimization, the classifier reached a linear-kernel accuracy of 85.82%. The PN types studied mainly included mixed nevus, junctional nevus, intradermal nevus, and acral nevus. Sensitivity under collaborative training was higher than under feature training and fusion-feature training, and the differences among the three training schemes were significant (P < 0.05). The sensitivity and specificity of the observation group were 88.65% and 90.26%, versus 85.65% and 84.03% in the control group; the differences between the two groups were significant (P < 0.05). In conclusion, deep-learning-assisted dermoscopy can be applied to the diagnosis of PN and helps improve diagnostic accuracy. The dermoscopic manifestations of PN showed a correspondence with case type and can provide auxiliary diagnosis in clinical practice.
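The "linear kernel accuracy" the abstract reports suggests a support-vector-machine-style classifier applied to extracted image features. The sketch below shows that general pattern on synthetic feature vectors; the study's actual feature extraction, fusion, and collaborative-training steps are assumptions not reproduced here.

```python
# Hypothetical sketch: linear-kernel SVM classification of extracted
# dermoscopic features. Synthetic data only; not the study's pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
N, D = 268, 24                      # 268 toy "patients", 24 image features each

# Synthetic feature vectors for two nevus classes with a mean shift
X = rng.normal(0.0, 1.0, size=(N, D))
y = rng.integers(0, 2, size=N)
X[y == 1] += 0.8                    # shift class 1 so the classes separate

# Half/half split mirrors the study's two equal groups of 134
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=1)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
print(f"toy accuracy: {clf.score(X_te, y_te):.2f}")
```

On real dermoscopic features the accuracy depends entirely on how informative the extracted and fused features are, which is the point of the comparison the abstract reports.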