1. Wu P, Liu Y, Li Y, Liu B. Robust Prostate Segmentation Using Intrinsic Properties of TRUS Images. IEEE Transactions on Medical Imaging 2015; 34:1321-1335. [PMID: 25576565] [DOI: 10.1109/tmi.2015.2388699]
Abstract
Accurate segmentation is usually crucial in transrectal ultrasound (TRUS) image based prostate diagnosis; however, it is always hampered by heavy speckles. Contrary to the traditional view that speckles are adverse to segmentation, we exploit intrinsic properties induced by speckles to facilitate the task, based on the observation that the sizes and orientations of speckles provide salient cues for determining the prostate boundary. Since the speckle orientation changes in accordance with a statistical prior rule, a rotation-invariant texture feature is extracted along the orientations revealed by the rule. To address feature changes due to different speckle sizes, TRUS images are split into several arc-like strips. In each strip, every individual feature vector is sparsely represented and representation residuals are obtained. The residuals, along with the spatial coherence inherited from biological tissues, are combined to segment the prostate preliminarily via graph cuts. The segmentation is then fine-tuned by a novel level set model, which integrates (1) the prostate shape prior, (2) the dark-to-light intensity transition near the prostate boundary, and (3) the texture feature just obtained. The proposed method is validated on two 2-D image datasets obtained from two different sonographic imaging systems, with mean absolute distances on mid-gland images of only 1.06±0.53 mm and 1.25±0.77 mm, respectively. The method is also extended to segment apex and base images, producing results competitive with the state of the art.
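The mean absolute distance (MAD) figures quoted in this and several later abstracts can be illustrated with a minimal sketch of the metric. The symmetric point-set formulation below is a common choice; the paper's exact implementation is not specified, and all names are ours. Coordinates are assumed to already be in millimetres.

```python
import numpy as np

def mean_absolute_distance(contour_a, contour_b):
    """Symmetric mean absolute distance between two boundary point sets
    of shape (N, 2): average nearest-neighbour distance in both
    directions, then averaged."""
    # Pairwise Euclidean distances between all boundary points.
    d = np.linalg.norm(contour_a[:, None, :] - contour_b[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

# Toy example: two densely sampled concentric circles of radius 10 and 11;
# the nearest neighbour of each point lies at the same angle, so MAD = 1.0.
t = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
a = np.stack([10.0 * np.cos(t), 10.0 * np.sin(t)], axis=1)
b = np.stack([11.0 * np.cos(t), 11.0 * np.sin(t)], axis=1)
mad = mean_absolute_distance(a, b)
```

For sparsely sampled contours the nearest-neighbour distance overestimates less than a naive same-index pairing, which is why the symmetric form is typically preferred.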
2. Qiu W, Yuan J, Ukwatta E, Sun Y, Rajchl M, Fenster A. Prostate segmentation: an efficient convex optimization approach with axial symmetry using 3-D TRUS and MR images. IEEE Transactions on Medical Imaging 2014; 33:947-960. [PMID: 24710163] [DOI: 10.1109/tmi.2014.2300694]
Abstract
We propose a novel global optimization-based approach to segmentation of 3-D prostate transrectal ultrasound (TRUS) and T2-weighted magnetic resonance (MR) images, enforcing the inherent axial symmetry of prostate shapes to simultaneously adjust a series of 2-D slice-wise segmentations in a "global" 3-D sense. We show that the introduced challenging combinatorial optimization problem can be solved globally and exactly by means of convex relaxation. In this regard, we propose a novel coherent continuous max-flow model (CCMFM), from which we derive a new and efficient duality-based algorithm, leading to a GPU-based implementation that achieves high computational speed. Experiments with 25 3-D TRUS images and 30 3-D T2w MR images from our dataset, and 50 3-D T2w MR images from a public dataset, demonstrate that the proposed approach can segment a 3-D prostate TRUS/MR image within 5-6 s, including 4-5 s for initialization, yielding a mean Dice similarity coefficient of 93.2%±2.0% for 3-D TRUS images and 88.5%±3.5% for 3-D MR images. The proposed method also yields relatively low intra- and inter-observer variability introduced by manual user initialization, suggesting high reproducibility independent of observers.
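The Dice similarity coefficient (DSC) reported here, and in several other entries, has a standard definition on binary masks. A minimal sketch (function names ours):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient DSC = 2|A∩B| / (|A| + |B|) for two
    binary masks of identical shape (2-D or 3-D)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy 3-D example: two 6x6x6 boxes in a 10x10x10 volume, overlapping in
# 4 of 6 slices along the first axis -> DSC = 2*144 / (216 + 216) = 2/3.
a = np.zeros((10, 10, 10), dtype=bool); a[2:8, 2:8, 2:8] = True
b = np.zeros((10, 10, 10), dtype=bool); b[4:10, 2:8, 2:8] = True
dsc = dice(a, b)
```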
3. Mahdavi SS, Moradi M, Morris WJ, Goldenberg SL, Salcudean SE. Fusion of ultrasound B-mode and vibro-elastography images for automatic 3D segmentation of the prostate. IEEE Transactions on Medical Imaging 2012; 31:2073-2082. [PMID: 22829391] [DOI: 10.1109/tmi.2012.2209204]
Abstract
Prostate segmentation in B-mode images is a challenging task even when done manually by experts. In this paper we propose a 3D automatic prostate segmentation algorithm which makes use of information from both ultrasound B-mode and vibro-elastography data. We exploit the high contrast-to-noise ratio of vibro-elastography images of the prostate, in addition to the commonly used B-mode images, to implement a 2D Active Shape Model (ASM)-based segmentation algorithm on the mid-gland image. The prostate model is deformed by a combination of two measures: the gray-level similarity and the continuity of the prostate edge in both image types. The automatically obtained mid-gland contour is then used to initialize a 3D segmentation algorithm which models the prostate as a tapered and warped ellipsoid. Vibro-elastography images are used in addition to ultrasound images to improve boundary detection. We report Dice similarity coefficients of 0.87±0.07 and 0.87±0.08 comparing the 2D automatic contours with manual contours of two observers on 61 images. For 11 cases, whole-gland volume errors of 10.2±2.2% and 13.5±4.1% and whole-gland volume differences of -7.2±9.1% and -13.3±12.6% between 3D automatic and manual surfaces of the two observers are obtained. This is the first validated work showing the fusion of B-mode and vibro-elastography data for automatic 3D segmentation of the prostate.
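The fusion idea, pooling boundary evidence from both modalities, can be illustrated with a simple weighted edge map. The gradient-based edge measure and the mixing weight below are our illustrative choices, not the authors' exact ASM cost function:

```python
import numpy as np

def gradient_magnitude(img):
    """Edge strength as the Euclidean norm of the image gradient."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def fused_edge_map(bmode, elasto, w_elasto=0.6):
    """Weighted combination of edge strength from co-registered B-mode and
    vibro-elastography images of identical shape; the higher weight on
    elastography reflects its higher contrast-to-noise ratio."""
    e1 = gradient_magnitude(bmode)
    e2 = gradient_magnitude(elasto)
    # Normalise each edge map to [0, 1] before mixing.
    e1 = e1 / (e1.max() or 1.0)
    e2 = e2 / (e2.max() or 1.0)
    return (1.0 - w_elasto) * e1 + w_elasto * e2

# Toy example: a vertical step edge at column 4 in both modalities; the
# fused map peaks at the step and is zero in the flat left region.
bmode = np.zeros((8, 8)); bmode[:, 4:] = 1.0
edge = fused_edge_map(bmode, bmode)
```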
4. Yang X, Fei B. 3D Prostate Segmentation of Ultrasound Images Combining Longitudinal Image Registration and Machine Learning. Proceedings of SPIE--The International Society for Optical Engineering 2012; 8316:83162O. [PMID: 24027622] [DOI: 10.1117/12.912188]
Abstract
We developed a three-dimensional (3D) segmentation method for transrectal ultrasound (TRUS) images based on longitudinal image registration and machine learning. Using longitudinal images of each individual patient, we register previously acquired images to the new images of the same subject. Three orthogonal Gabor filter banks are used to extract texture features from each registered image. Patient-specific Gabor features from the registered images are used to train kernel support vector machines (KSVMs), which then segment the newly acquired prostate image. The segmentation method was tested on TRUS data from five patients. The average surface distance between our automatic segmentation and manual segmentation is 1.18 ± 0.31 mm, indicating that our method based on longitudinal image registration is feasible for segmenting the prostate in TRUS images.
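The Gabor-feature stage can be sketched with a small hand-rolled filter bank. The kernel size, frequencies, and orientation count below are illustrative assumptions, not the parameters used in the paper; each pixel's response vector is the kind of feature that would feed a kernel SVM.

```python
import numpy as np
from scipy import ndimage

def gabor_kernel(freq, theta, sigma=2.0, size=9):
    """Real part of a 2-D Gabor kernel at spatial frequency `freq`
    (cycles/pixel) and orientation `theta` (radians): a cosine carrier
    under an isotropic Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # carrier axis
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * freq * xr)

def gabor_features(img, freqs=(0.1, 0.2), n_orient=4):
    """Stack of |response| maps for the filter bank; the result has one
    feature channel per (frequency, orientation) pair."""
    feats = []
    for f in freqs:
        for k in range(n_orient):
            kern = gabor_kernel(f, np.pi * k / n_orient)
            feats.append(np.abs(ndimage.convolve(img.astype(float), kern)))
    return np.stack(feats, axis=-1)   # shape (H, W, len(freqs) * n_orient)

feats = gabor_features(np.ones((16, 16)))
```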
Affiliation(s)
- Xiaofeng Yang
- Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA
5. Fei B, Schuster DM, Master V, Akbari H, Fenster A, Nieh P. A Molecular Image-directed, 3D Ultrasound-guided Biopsy System for the Prostate. Proceedings of SPIE--The International Society for Optical Engineering 2012. [PMID: 22708023] [DOI: 10.1117/12.912182]
Abstract
Systematic transrectal ultrasound (TRUS)-guided biopsy is the standard method for a definitive diagnosis of prostate cancer. However, this approach uses two-dimensional (2D) ultrasound images to guide needle placement and can miss up to 30% of prostate cancers. We are developing a molecular image-directed, three-dimensional (3D) ultrasound image-guided biopsy system for improved detection of prostate cancer. The system consists of a 3D mechanical localization system and a software workstation for image segmentation, registration, and biopsy planning. In order to plan biopsies in a 3D prostate, we developed an automatic segmentation method based on wavelet transform. In order to incorporate PET/CT images into ultrasound-guided biopsy, we developed image registration methods to fuse TRUS and PET/CT images. The segmentation method was tested in ten patients with a Dice overlap ratio of 92.4% ± 1.1%. The registration method has been tested in phantoms. The biopsy system was tested in prostate phantoms, and 3D ultrasound images were acquired from two human patients. We are integrating the system for PET/CT-directed, 3D ultrasound-guided, targeted biopsy in human patients.
Affiliation(s)
- Baowei Fei
- Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA 30329
6. Wong A, Scharcanski J. Fisher-Tippett region-merging approach to transrectal ultrasound prostate lesion segmentation. IEEE Transactions on Information Technology in Biomedicine 2011; 15:900-907. [PMID: 21824854] [DOI: 10.1109/titb.2011.2163724]
Abstract
In this paper, a computerized approach to segmenting prostate lesions in transrectal ultrasound (TRUS) images is presented. Segmenting prostate lesions in TRUS images is very challenging due to poor contrast, low SNR, and irregular shape variations. To address these issues, a novel approach is employed in which lesions are separated from the surrounding prostate tissue by region merging, driven by a likelihood function built on regional Fisher-Tippett statistics. Experimental results on TRUS prostate images demonstrate that the proposed Fisher-Tippett region-merging approach segments prostate lesions more accurately than competing segmentation methods.
Affiliation(s)
- Alexander Wong
- Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada.
7. Garnier C, Bellanger JJ, Wu K, Shu H, Costet N, Mathieu R, De Crevoisier R, Coatrieux JL. Prostate segmentation in HIFU therapy. IEEE Transactions on Medical Imaging 2011; 30:792-803. [PMID: 21118767] [PMCID: PMC3095593] [DOI: 10.1109/tmi.2010.2095465]
Abstract
Prostate segmentation in 3-D transrectal ultrasound images is an important step in defining the intra-operative plan for high intensity focused ultrasound (HIFU) therapy. This paper presents two semi-automatic approaches, based on discrete dynamic contour and optimal surface detection, respectively. Both operate in 3-D and require minimal user interaction. They are considered alone or sequentially combined, with and without post-regularization, and applied to anisotropic and isotropic volumes. Their performance has been evaluated with different metrics on a set of 28 3-D images, by comparison with delineations from two experts. For the most efficient algorithm, the symmetric average surface distance was found to be 0.77 mm.
Affiliation(s)
- Carole Garnier
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, FR
- Jean-Jacques Bellanger
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, FR
- Ke Wu
- CRIBS, Centre de Recherche en Information Biomédicale sino-français, INSERM Laboratoire International Associé, Université de Rennes I / SouthEast University, Rennes, FR
- LIST, Laboratory of Image Science and Technology, SouthEast University, Si Pai Lou 2, Nanjing 210096, CN
- Huazhong Shu
- CRIBS, Centre de Recherche en Information Biomédicale sino-français, INSERM Laboratoire International Associé, Université de Rennes I / SouthEast University, Rennes, FR
- LIST, Laboratory of Image Science and Technology, SouthEast University, Si Pai Lou 2, Nanjing 210096, CN
- Nathalie Costet
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, FR
- Romain Mathieu
- Service d'urologie, CHU Rennes, Hôpital Pontchaillou, Université de Rennes I, 2 rue Henri Le Guilloux, 35033 Rennes Cedex 9, FR
- Renaud De Crevoisier
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, FR
- Département de radiothérapie, CRLCC Eugène Marquis, 35000 Rennes, FR
- Jean-Louis Coatrieux
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, FR
- CRIBS, Centre de Recherche en Information Biomédicale sino-français, INSERM Laboratoire International Associé, Université de Rennes I / SouthEast University, Rennes, FR
- * Correspondence should be addressed to: Jean-Louis Coatrieux
8. Mahdavi SS, Chng N, Spadinger I, Morris WJ, Salcudean SE. Semi-automatic segmentation for prostate interventions. Med Image Anal 2010; 15:226-237. [PMID: 21084216] [DOI: 10.1016/j.media.2010.10.002]
Abstract
In this paper we report and characterize a semi-automatic prostate segmentation method for prostate brachytherapy. Based on anatomical evidence and requirements of the treatment procedure, a warped and tapered ellipsoid was found suitable as the a-priori 3D shape of the prostate. By transforming the acquired endorectal transverse images of the prostate into ellipses, the shape fitting problem was cast into a convex problem which can be solved efficiently. The average whole gland error between non-overlapping volumes created from manual and semi-automatic contours from 21 patients was 6.63 ± 0.9%. For use in brachytherapy treatment planning, the resulting contours were modified, if deemed necessary, by radiation oncologists prior to treatment. The average whole gland volume error between the volumes computed from semi-automatic contours and those computed from modified contours, from 40 patients, was 5.82 ± 4.15%. The amount of bias in the physicians' delineations when given an initial semi-automatic contour was measured by comparing the volume error between 10 prostate volumes computed from manual contours with those of modified contours. This error was found to be 7.25 ± 0.39% for the whole gland. Automatic contouring reduced subjectivity, as evidenced by a decrease in segmentation inter- and intra-observer variability from 4.65% and 5.95% for manual segmentation to 3.04% and 3.48% for semi-automatic segmentation, respectively. We characterized the performance of the method relative to the reference obtained from manual segmentation by using a novel approach that divides the prostate region into nine sectors. We analyzed each sector independently as the requirements for segmentation accuracy depend on which region of the prostate is considered. The measured segmentation time is 14 ± 1s with an additional 32 ± 14s for initialization. 
By assuming 1-3 min for modification of the contours, if necessary, a total segmentation time of less than 4 min is required, with no additional time required prior to treatment planning. This compares favorably to the 5-15 min manual segmentation time required for experienced individuals. The method is currently used at the British Columbia Cancer Agency (BCCA) Vancouver Cancer Centre as part of the standard treatment routine in low dose rate prostate brachytherapy and is found to be a fast, consistent and accurate tool for the delineation of the prostate gland in ultrasound images.
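The warped, tapered ellipsoid used as the a-priori shape here (and in the following entry) can be sketched as a stack of scaled elliptical cross-sections. The linear taper law and all parameter values below are our illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def tapered_ellipsoid(a=25.0, b=20.0, c=18.0, taper=0.3, n_slices=9, n_pts=60):
    """Point cloud (mm) of an axially tapered ellipsoid: each transverse
    slice is an ellipse with semi-axes a, b scaled by the ellipsoid
    profile, then shrunk linearly toward the base end (z = +c)."""
    slices = []
    for z in np.linspace(-0.9 * c, 0.9 * c, n_slices):
        scale = np.sqrt(max(1.0 - (z / c) ** 2, 0.0))   # ellipsoid profile
        scale *= 1.0 - taper * (z + c) / (2.0 * c)      # linear taper
        t = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
        xs = a * scale * np.cos(t)
        ys = b * scale * np.sin(t)
        slices.append(np.stack([xs, ys, np.full(n_pts, z)], axis=1))
    return np.concatenate(slices, axis=0)   # (n_slices * n_pts, 3)

pts = tapered_ellipsoid()
```

A shape fit would then adjust (a, b, c, taper) plus a warp so the model contours match the transformed endorectal images.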
Affiliation(s)
- S Sara Mahdavi
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada.
9.
Abstract
Prostate segmentation from trans-rectal transverse B-mode ultrasound images is required for radiation treatment of prostate cancer. Manual segmentation is a time-consuming task, the results of which are dependent on image quality and physicians' experience. This paper introduces a semi-automatic 3D method based on super-ellipsoidal shapes. It produces a 3D segmentation in less than 15 seconds using a warped, tapered ellipsoid fit to the prostate. A study of patient images shows good performance and repeatability. This method is currently in clinical use at the Vancouver Cancer Center where it has become the standard segmentation procedure for low dose-rate brachytherapy treatment.
10. Liang K, Rogers AJ, Light ED, Von Allmen D, Smith SW. Simulation of autonomous robotic multiple-core biopsy by 3D ultrasound guidance. Ultrasonic Imaging 2010; 32:118-127. [PMID: 20687279] [PMCID: PMC3018680] [DOI: 10.1177/016173461003200205]
Abstract
An autonomous multiple-core biopsy system guided by real-time 3D ultrasound and operated by a robotic arm with 6+1 degrees of freedom has been developed. Using a specimen of turkey breast as a tissue phantom, our system was able to first autonomously locate the phantom in the image volume and then perform needle sticks in each of eight sectors in the phantom in a single session, with no human intervention required. Based on the fraction of eight sectors successfully sampled in an experiment of five trials, a success rate of 93% was recorded. This system could have relevance in clinical procedures that involve multiple needle-core sampling such as prostate or breast biopsy.
Affiliation(s)
- Kaicheng Liang
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA.
11. Wu X, Spencer SA, Shen S, Fiveash JB, Duan J, Brezovich IA. Development of an accelerated GVF semi-automatic contouring algorithm for radiotherapy treatment planning. Comput Biol Med 2009; 39:650-656. [DOI: 10.1016/j.compbiomed.2009.05.001]
12. El Naqa I, Yang D, Apte A, Khullar D, Mutic S, Zheng J, Bradley JD, Grigsby P, Deasy JO. Concurrent multimodality image segmentation by active contours for radiotherapy treatment planning. Med Phys 2008; 34:4738-4749. [PMID: 18196801] [DOI: 10.1118/1.2799886]
Abstract
Multimodality imaging information is now regularly used in radiotherapy treatment planning for cancer patients. The authors are investigating methods to take advantage of all the imaging information available for joint target registration and segmentation, including multimodality images or multiple image sets from the same modality. In particular, the authors have developed variational methods based on multivalued level set deformable models for simultaneous 2D or 3D segmentation of multimodality images consisting of combinations of coregistered PET, CT, or MR data sets. The combined information is integrated to define the overall biophysical structure volume. The authors demonstrate the methods on three patient data sets: a non-small cell lung cancer case with PET/CT, a cervix cancer case with PET/CT, and a prostate case with CT and MRI. CT, PET, and MR phantom data were also used for quantitative validation of the proposed multimodality segmentation approach. The corresponding Dice similarity coefficient (DSC) was 0.90 +/- 0.02 (p < 0.0001) with an estimated target volume error of 1.28 +/- 1.23%. Preliminary results indicate that concurrent multimodality segmentation can provide a feasible and accurate framework for combining imaging data from different modalities and is a potentially useful tool for delineating biophysical structure volumes in radiotherapy treatment planning.
Affiliation(s)
- Issam El Naqa
- Department of Radiation Oncology, School of Medicine, Washington University, St. Louis, Missouri 63110, USA.
13. Sahba F, Tizhoosh HR, Salama MMA. Application of reinforcement learning for segmentation of transrectal ultrasound images. BMC Med Imaging 2008; 8:8. [PMID: 18430220] [PMCID: PMC2397386] [DOI: 10.1186/1471-2342-8-8]
Abstract
Background: Among medical image modalities, ultrasound imaging is in very widespread clinical use, but factors such as poor image contrast, noise, and missing or diffuse boundaries make ultrasound images inherently difficult to segment. An important application is estimating the location and volume of the prostate in transrectal ultrasound (TRUS) images, for which manual segmentation is a tedious and time-consuming procedure.
Methods: We introduce a new method for segmenting the prostate in transrectal ultrasound images using a reinforcement learning scheme. The algorithm finds appropriate local values for sub-images and uses them to extract the prostate. It contains an offline stage in which the reinforcement learning agent learns from a set of images and their manually segmented versions. The agent is given objectively determined rewards and punishments as it explores and exploits the solution space. After this stage, the knowledge the agent has acquired is stored in the Q-matrix and can be applied to new input images to extract a coarse version of the prostate.
Results: We carried out experiments segmenting TRUS images. The results demonstrate the potential of this approach in the field of medical image segmentation.
Conclusion: The proposed method finds appropriate local values and segments the prostate. The approach can be used for segmentation tasks containing one object of interest, although further investigation is needed to improve this prototype.
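The Q-matrix mentioned above is filled by the standard tabular Q-learning update. A minimal sketch follows; the state/action names are hypothetical (e.g. an action adjusting the local gray-level threshold of one sub-image), and the learning rate and discount are illustrative, not the paper's settings:

```python
def q_update(Q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """One tabular Q-learning step:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
    Q is a dict mapping state -> {action: value}; unseen entries are 0."""
    best_next = max(Q[next_state].values()) if Q.get(next_state) else 0.0
    Q.setdefault(state, {}).setdefault(action, 0.0)
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])
    return Q[state][action]

# Hypothetical step: raising the threshold of sub-image 3 earned reward 1.0.
Q = {}
v = q_update(Q, 'subimage_3', 'raise_threshold', 1.0, 'subimage_4')  # -> 0.5
```

After training, the greedy action per sub-image (argmax over the Q row) would give the local values used to extract the coarse prostate region.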
Affiliation(s)
- Farhang Sahba
- Medical Instrument Analysis and Machine Intelligence Group, University of Waterloo, Waterloo, Canada.
14. Haas B, Coradi T, Scholz M, Kunz P, Huber M, Oppitz U, André L, Lengkeek V, Huyskens D, van Esch A, Reddick R. Automatic segmentation of thoracic and pelvic CT images for radiotherapy planning using implicit anatomic knowledge and organ-specific segmentation strategies. Phys Med Biol 2008; 53:1751-1771. [DOI: 10.1088/0031-9155/53/6/017]
15. Tutar IB, Pathak SD, Gong L, Cho PS, Wallner K, Kim Y. Semiautomatic 3-D prostate segmentation from TRUS images using spherical harmonics. IEEE Transactions on Medical Imaging 2006; 25:1645-1654. [PMID: 17167999] [DOI: 10.1109/tmi.2006.884630]
Abstract
The prostate brachytherapy quality assessment procedure should be performed while the patient is still on the operating table, since this would enable physicians to implant additional seeds immediately if necessary, reducing costs and improving patient outcomes. Seed placement is readily performed under fluoroscopy and ultrasound guidance. It has therefore been proposed that seed locations be reconstructed from fluoroscopic images and prostate boundaries be identified in ultrasound images so that dosimetry can be performed in the operating room. However, a key hurdle must be overcome for ultrasound and fluoroscopy-based dosimetry: manually outlining prostate boundaries in ultrasound images is highly time-consuming, and no existing method lets physicians identify three-dimensional (3-D) prostate boundaries in postimplant ultrasound images in a fast and robust fashion. In this paper, we propose a new method in which segmentation is defined, in an optimization framework, as fitting the best surface to the underlying images under shape constraints. To derive these constraints, we modeled the shape of the prostate using spherical harmonics of degree eight and performed statistical analysis on the shape parameters. After user initialization, our algorithm identifies the prostate boundaries in 2 min on average. For validation, we collected 30 postimplant prostate volume sets, each consisting of axial transrectal ultrasound images acquired at 1-mm increments. For each volume set, three experts outlined the prostate boundaries first manually and then using our algorithm. Treating the average of the manual boundaries as the ground truth, we computed a mean absolute distance error of 1.26 +/- 0.41 mm and a percent volume overlap of 83.5 +/- 4.2%. We found the segmentation error to be slightly less than the clinically observed interobserver variability.
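A spherical-harmonic shape model of the kind described here represents the radius of a star-shaped surface as a truncated expansion. The sketch below uses degree eight as in the paper, but the coefficient dictionary, the complex-basis convention, and all names are our illustrative assumptions:

```python
import numpy as np
from scipy.special import sph_harm

def sh_radius(coeffs, theta, phi, degree=8):
    """Radius r(theta, phi) of a star-shaped surface as a truncated
    spherical-harmonic expansion; `coeffs` maps (l, m) -> c_lm, with
    theta the azimuthal angle and phi the polar angle."""
    r = np.zeros_like(np.asarray(theta, dtype=complex))
    for l in range(degree + 1):
        for m in range(-l, l + 1):
            c = coeffs.get((l, m), 0.0)
            if c:
                # scipy argument order: sph_harm(m, l, azimuth, polar)
                r = r + c * sph_harm(m, l, theta, phi)
    return r.real

# A surface with only the c_00 term is a sphere of radius c_00 * Y_00,
# where Y_00 = 1 / (2 * sqrt(pi)) ~= 0.2821.
r0 = sh_radius({(0, 0): 1.0}, theta=0.3, phi=1.0)
```

Statistical analysis of the low-order coefficients across a training population is what yields the shape constraints for the surface fit.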
Affiliation(s)
- Ismail B Tutar
- Image Computing Systems Laboratory, Departments of Electrical Engineering and Bioengineering, University of Washington, Seattle, WA 98195, USA