1. Alessio AM, Kinahan PE, Sauer K, Kalra MK, De Man B. Comparison Between Pre-Log and Post-Log Statistical Models in Ultra-Low-Dose CT Reconstruction. IEEE Trans Med Imaging 2017;36:707-720. [PMID: 28113926; PMCID: PMC5424567; DOI: 10.1109/tmi.2016.2627004]
Abstract
X-ray detectors in clinical computed tomography (CT) usually operate in current-integrating mode. Their complicated signal statistics often lead to intractable likelihood functions for practical use in model-based image reconstruction (MBIR). It is therefore desirable to design simplified statistical models without losing the essential factors. Depending on whether the CT transmission data are logarithmically transformed, pre-log and post-log models are the two major categories of choices in CT MBIR. Since both are approximations, it remains an open question whether one model can notably improve image quality over the other on real scanners. In this study, we develop and compare several pre-log and post-log MBIR algorithms under a unified framework, and evaluate their reconstruction accuracy on simulated and clinical datasets. The results show that pre-log MBIR can achieve notably better quantitative accuracy than post-log MBIR in ultra-low-dose CT, although in less extreme cases, post-log MBIR with handcrafted pre-processing remains a competitive alternative. Pre-log MBIR could therefore play a growing role in emerging ultra-low-dose CT applications.
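The pre-log/post-log distinction can be made concrete by writing out the two data-fit terms involved. A minimal sketch (illustrative counts and blank-scan values, not the paper's algorithms): the pre-log term is the Poisson negative log-likelihood of the raw transmission counts, while the post-log term is the conventional weighted least-squares fit to the log-transformed data.

```python
import numpy as np

def prelog_nll(l, y, b):
    """Pre-log data term: Poisson negative log-likelihood of the raw
    transmission counts, y_i ~ Poisson(b_i * exp(-l_i)), up to constants."""
    ybar = b * np.exp(-l)
    return float(np.sum(ybar - y * np.log(ybar)))

def postlog_wls(l, y, b):
    """Post-log data term: weighted least squares on lhat = log(b/y),
    whose approximate variance is 1/y, giving the usual weights w = y."""
    lhat = np.log(b / y)
    return float(0.5 * np.sum(y * (l - lhat) ** 2))

rng = np.random.default_rng(0)
b = np.full(64, 50.0)                      # low blank-scan counts (illustrative)
l_true = rng.uniform(0.5, 2.0, 64)         # line integrals of attenuation
y = np.maximum(rng.poisson(b * np.exp(-l_true)), 1)  # clamp zeros for the log
cost_pre = prelog_nll(l_true, y, b)
cost_post = postlog_wls(l_true, y, b)
```

Note that the zero-count clamp needed before taking the logarithm is exactly the kind of handcrafted pre-processing the post-log route requires; the pre-log likelihood handles zero counts without it.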
2. Schmidt TG, Zimmerman KC, Sidky EY. The effects of extending the spectral information acquired by a photon-counting detector for spectral CT. Phys Med Biol 2015;60:1583-600. [PMID: 25615511; DOI: 10.1088/0031-9155/60/4/1583]
Abstract
Photon-counting x-ray detectors with pulse-height analysis provide spectral information that may improve material decomposition and contrast-to-noise ratio (CNR) in CT images. The number of energy measurements that can be acquired simultaneously on a detector pixel equals the number of comparator channels, and some spectral CT designs limit the number of comparator channels because of the complexity of the readout electronics. The spectral information could be extended by changing the comparator threshold levels across time, subpixels, or view angles. However, acquiring more energy measurements than comparator channels increases noise and/or dose, because of differences in noise correlations across energy measurements and decreased dose utilisation. This study experimentally quantified these effects using a bench-top spectral CT system, and an analytical and simulation study modeling an ideal detector investigated whether there is a net benefit for material decomposition or optimal energy weighting in acquiring more energy measurements than comparator channels. Experimental results demonstrated that in a two-threshold acquisition, acquiring the high-energy measurement independently from the low-energy measurement increased the noise standard deviation in material-decomposition basis images by factors of 1.5-1.7, due to changes in covariance between energy measurements, while CNR in energy-weighted images decreased by factors of 0.92-0.71. Noise standard deviation increased by an additional factor of [Formula: see text] due to reduced dose utilisation. The results demonstrated no benefit for two-material decomposition noise or energy-weighted CNR when acquiring more energy measurements than comparator channels. Understanding this noise penalty is important for designing spectral detectors, as well as for designing experiments and interpreting data from prototype systems with a limited number of comparator channels.
Affiliation(s)
- Taly Gilat Schmidt, Department of Biomedical Engineering, Marquette University, Milwaukee, WI 53233, USA
3. Long Y, Fessler JA. Multi-material decomposition using statistical image reconstruction for spectral CT. IEEE Trans Med Imaging 2014;33:1614-26. [PMID: 24801550; PMCID: PMC4125500; DOI: 10.1109/tmi.2014.2320284]
Abstract
Spectral computed tomography (CT) provides information for material characterization and quantification through its ability to separate different basis materials. Dual-energy (DE) CT provides two sets of measurements at two different source energies, so in principle two materials can be accurately decomposed from DECT measurements. However, many clinical and industrial applications require three or more material images. For triple-material decomposition, a third constraint, such as volume conservation, mass conservation, or both, is required to solve for three sets of unknowns from two sets of measurements. The recently proposed flexible image-domain (ID) multi-material decomposition method assumes each pixel contains at most three of several possible materials and decomposes a mixture pixel by pixel. We propose a penalized-likelihood (PL) method with edge-preserving regularizers for each material to reconstruct multi-material images from sinogram data using a similar constraint. We develop an optimization transfer method with a series of pixel-wise separable quadratic surrogate (PWSQS) functions to monotonically decrease the complicated PL cost function. The PWSQS algorithm separates pixels to allow simultaneous update of all pixels, but keeps the basis materials coupled to achieve a faster convergence rate than our previously proposed material- and pixel-wise SQS algorithms. Compared with the ID method in 2-D fan-beam simulations, the PL method greatly reduced noise, streak, and cross-talk artifacts in the reconstructed basis component images and achieved much smaller root mean square errors.
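The role of the extra constraint can be seen in a toy image-domain example. A sketch with made-up attenuation coefficients (illustrative numbers, not calibrated values, and not the paper's PL method): with only two energy measurements per pixel, appending the volume-conservation constraint f1 + f2 + f3 = 1 makes the three-material system solvable.

```python
import numpy as np

# Basis-material attenuation coefficients at two source energies
# (made-up illustrative numbers, NOT calibrated values):
# columns = water, bone, iodine; rows = low kVp, high kVp.
M = np.array([[0.20, 0.50, 2.00],
              [0.18, 0.30, 1.00]])

def triple_decompose(mu_low, mu_high):
    """Recover three volume fractions from two DECT measurements by
    appending the volume-conservation constraint f1 + f2 + f3 = 1."""
    A = np.vstack([M, np.ones(3)])
    b = np.array([mu_low, mu_high, 1.0])
    return np.linalg.solve(A, b)

f_true = np.array([0.7, 0.2, 0.1])   # water/bone/iodine fractions in one pixel
mu = M @ f_true                      # noiseless two-energy measurements
f_est = triple_decompose(mu[0], mu[1])
```

With noisy measurements this pixel-wise solve amplifies noise, which is the motivation for regularized, sinogram-domain approaches like the PL method above.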
Affiliation(s)
- Yong Long, CT Systems and Application Laboratory, GE Global Research Center, Niskayuna, NY 12309, USA
- Jeffrey A. Fessler, Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI 48109, USA
4. Tong S, Alessio AM, Kinahan PE. Image reconstruction for PET/CT scanners: past achievements and future challenges. Imaging Med 2010;2:529-545. [PMID: 21339831; DOI: 10.2217/iim.10.49]
Abstract
PET is a medical imaging modality with proven clinical value for disease diagnosis and treatment monitoring. The integration of PET and CT on modern scanners provides a synergy of the two imaging modalities. Through different mathematical algorithms, PET data can be reconstructed into the spatial distribution of the injected radiotracer. With dynamic imaging, kinetic parameters of specific biological processes can also be determined. Numerous efforts have been devoted to the development of PET image reconstruction methods over the last four decades, encompassing analytic and iterative reconstruction methods. This article provides an overview of the commonly used methods. Current challenges in PET image reconstruction include more accurate quantitation, TOF imaging, system modeling, motion correction and dynamic reconstruction. Advances in these aspects could enhance the use of PET/CT imaging in patient care and in clinical research studies of pathophysiology and therapeutic interventions.
Affiliation(s)
- Shan Tong, Department of Radiology, University of Washington, Seattle, WA, USA
5. Ramani S, Thevenaz P, Unser M. Regularized interpolation for noisy images. IEEE Trans Med Imaging 2010;29:543-558. [PMID: 20129854; DOI: 10.1109/tmi.2009.2038576]
Abstract
Interpolation is the means by which a continuously defined model is fit to discrete data samples. When the data samples are free of noise, it seems desirable to build the model by fitting them exactly. In medical imaging, where quality is of paramount importance, this ideal situation unfortunately does not occur. In this paper, we propose a scheme that improves quality by specifying a tradeoff between fidelity to the data and robustness to the noise. We resort to variational principles, which allow us to impose smoothness constraints on the model for tackling noisy data. Based on shift-, rotation-, and scale-invariance requirements on the model, we show that the Lp-norm of an appropriate vector derivative is the most suitable choice of regularization for this purpose. In addition to Tikhonov-like quadratic regularization, this includes edge-preserving total-variation-like (TV) regularization. We give algorithms to recover the continuously defined model from noisy samples and also provide a data-driven scheme to determine the optimal amount of regularization. We validate our method with numerical examples in which we demonstrate its superiority over an exact fit, as well as the benefit of TV-like nonquadratic regularization over Tikhonov-like quadratic regularization.
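The fidelity-robustness tradeoff described here can be sketched in its simplest quadratic (Tikhonov-like) form in 1-D. This is an illustrative discrete analogue, not the authors' continuous-domain formulation; `lam` is the regularization weight trading data fidelity against roughness.

```python
import numpy as np

def regularized_fit(f, lam):
    """Tikhonov-like regularized fit to noisy samples f: minimize
    ||c - f||^2 + lam * ||D c||^2 with D the first-difference operator.
    lam = 0 recovers the exact (interpolating) fit."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n first differences
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, f)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
clean = np.sin(2 * np.pi * x)
noisy = clean + 0.3 * rng.standard_normal(50)
smooth = regularized_fit(noisy, lam=5.0)
```

For slowly varying data corrupted by noise, the regularized fit lands closer to the underlying signal than the exact fit does, which is the paper's central point.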
Affiliation(s)
- Sathish Ramani, Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI 48109, USA
6. Xu J, Tsui BMW. Electronic noise modeling in statistical iterative reconstruction. IEEE Trans Image Process 2009;18:1228-38. [PMID: 19398410; PMCID: PMC3107070; DOI: 10.1109/tip.2009.2017139]
Abstract
We consider electronic noise modeling in tomographic image reconstruction when the measured signal is the sum of a Gaussian distributed electronic noise component and another random variable whose log-likelihood function satisfies a certain linearity condition. Examples of such likelihood functions include the Poisson distribution and an exponential dispersion (ED) model that can approximate the signal statistics in integration mode X-ray detectors. We formulate the image reconstruction problem as a maximum-likelihood estimation problem. Using an expectation-maximization approach, we demonstrate that a reconstruction algorithm can be obtained following a simple substitution rule from the one previously derived without electronic noise considerations. To illustrate the applicability of the substitution rule, we present examples of a fully iterative reconstruction algorithm and a sinogram smoothing algorithm both in transmission CT reconstruction when the measured signal contains additive electronic noise. Our simulation studies show the potential usefulness of accurate electronic noise modeling in low-dose CT applications.
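As a back-of-the-envelope companion to this setting, the widely used shifted-Poisson simplification (related to, but simpler than, the paper's exponential-dispersion model) matches the first two moments of Poisson-plus-Gaussian data; a Monte Carlo sketch with illustrative rates:

```python
import numpy as np

def shifted_poisson_stats(lam, sigma, n=200_000, seed=2):
    """If m = Poisson(lam) + N(0, sigma^2), the shifted data m + sigma^2
    match a Poisson(lam + sigma^2) variable in mean and variance."""
    rng = np.random.default_rng(seed)
    m = rng.poisson(lam, n) + sigma * rng.standard_normal(n)
    z = m + sigma ** 2
    return z.mean(), z.var()

mean_z, var_z = shifted_poisson_stats(lam=20.0, sigma=3.0)
# target for both moments: lam + sigma^2 = 29
```

Matching the first two moments is what lets a Poisson-type likelihood, and hence Poisson-style reconstruction algorithms, be reused on electronic-noise-contaminated data.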
Affiliation(s)
- Jingyan Xu, Johns Hopkins University, Baltimore, MD 21287-0859, USA
7. Zhou J, Coatrieux JL, Luo L. Noniterative sequential weighted least squares algorithm for positron emission tomography reconstruction. Comput Med Imaging Graph 2008;32:710-9. [PMID: 18842391; DOI: 10.1016/j.compmedimag.2008.08.008]
Abstract
This paper proposes a new sequential weighted least squares (SWLS) method for positron emission tomography (PET) reconstruction. The SWLS algorithm is noniterative and can be considered equivalent to the penalized WLS (PWLS) method under certain initial conditions. However, a full implementation of SWLS is computationally intensive. To overcome this problem, we propose a simplified SWLS as a reasonable alternative. The performance of this simplified method is evaluated in experiments using both simulated and clinical data. The results show that it compares favorably with the original SWLS in both computation time and reconstruction quality.
Affiliation(s)
- Jian Zhou, Laboratory of Image Science and Technology, Southeast University, 210096, China
8. An iterative reconstruction using median root prior and anatomical prior from the segmented mu-map for count-limited transmission data in PET imaging. Ann Nucl Med 2008;22:269-79. [PMID: 18535877; DOI: 10.1007/s12149-007-0098-8]
Abstract
OBJECTIVE: Whole-body positron emission tomography (PET) examination has developed greatly in recent years. To reduce the overall examination time, the transmission scan has been increasingly shortened, and many noise-reduction methods have been developed for count-limited transmission data. Segmented attenuation correction (SAC) is one such method, in which the pixel values of the transmission image are grouped into several classes. The median root prior-ordered subset convex (MRP-OSC) algorithm is another, which controls the noise level on the assumption that pixel values change locally monotonically. This article presents an alternative approach based on a Bayesian iterative reconstruction technique that incorporates a median prior and an anatomical prior from the segmented mu-map for count-limited transmission data.
METHODS: The median prior and the anatomical prior are represented as two Gibbs distributions, and their product is used as the penalty function.
RESULTS: In the thorax simulation study, the mean square error from the true transmission image was lower for the proposed method (5.74 x 10^-5) than for MRP-OSC (6.72 x 10^-5) and SAC (7.08 x 10^-5), indicating that the proposed technique reduces noise more than MRP-OSC while avoiding the segmentation errors of SAC. In the thorax phantom study, the emission image corrected using the proposed technique displayed little noise and bias (27.42 +/- 0.96 kBq/ml, calculated from a region of interest drawn on the liver of the phantom), very close to the true value (28.0 kBq/ml).
CONCLUSIONS: The proposed method effectively reduces the propagation of noise from transmission data to emission data without loss of the quantitative accuracy of the PET image.
9. Li Q, Leahy RM. Statistical modeling and reconstruction of randoms precorrected PET data. IEEE Trans Med Imaging 2006;25:1565-72. [PMID: 17167992; DOI: 10.1109/tmi.2006.884193]
Abstract
Randoms precorrected positron emission tomography (PET) data is formed as the difference of two Poisson random variables. Its exact probability mass function (PMF) is inconvenient for use in likelihood-based iterative image reconstruction as it contains an infinite summation. The shifted Poisson model is a tractable approximation to this PMF but requires that negative values are truncated, resulting in positively biased reconstructions in low count studies. Here we analyze the properties of the exact PMF and propose a simple but accurate approximation that allows negative valued data. We investigate the properties of this approximation and demonstrate its application to penalized maximum likelihood image reconstruction.
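The Skellam structure of precorrected data, and the moment matching behind the shifted-Poisson model the authors improve upon, can be checked numerically (illustrative rates only, not the paper's approximation):

```python
import numpy as np

def precorrected_moments(lp, ld, n=200_000, seed=3):
    """Randoms-precorrected data y = Poisson(lp) - Poisson(ld), a Skellam
    variable that can go negative. Shifting by twice the randoms mean,
    z = y + 2*ld, matches Poisson(lp + ld) in both mean and variance."""
    rng = np.random.default_rng(seed)
    y = rng.poisson(lp, n) - rng.poisson(ld, n)
    z = y + 2.0 * ld
    return z.mean(), z.var(), float((y < 0).mean())

mean_z, var_z, frac_neg = precorrected_moments(lp=5.0, ld=4.0)
```

At low counts a substantial fraction of the precorrected data is negative, which is why truncating negatives before applying the shifted-Poisson model biases the reconstruction, and why an approximation that accepts negative values matters.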
Affiliation(s)
- Quanzheng Li, Signal and Image Processing Institute, University of Southern California, Los Angeles, CA 90089, USA
10. Li Q, Asma E, Qi J, Bading JR, Leahy RM. Accurate estimation of the Fisher information matrix for the PET image reconstruction problem. IEEE Trans Med Imaging 2004;23:1057-1064. [PMID: 15377114; DOI: 10.1109/tmi.2004.833202]
Abstract
The Fisher information matrix (FIM) plays a key role in the analysis and applications of statistical image reconstruction methods based on Poisson data models. The elements of the FIM are a function of the reciprocal of the mean values of sinogram elements. Conventional plug-in FIM estimation methods do not work well at low counts, where the FIM estimate is highly sensitive to the reciprocal mean estimates at individual detector pairs. A generalized error look-up table (GELT) method is developed to estimate the reciprocal of the mean of the sinogram data. This approach is also extended to randoms precorrected data. Based on these techniques, an accurate FIM estimate is obtained for both Poisson and randoms precorrected data. As an application, the new GELT method is used to improve resolution uniformity and achieve near-uniform image resolution in low count situations.
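For a tiny system the plug-in FIM can be written out directly, showing the reciprocal-mean dependence that the look-up-table method is designed to stabilize (toy system matrix, not from the paper):

```python
import numpy as np

def fisher_information(A, ybar):
    """Plug-in Fisher information for a Poisson model: F = A^T diag(1/ybar) A,
    where A is the system matrix and ybar the mean sinogram counts."""
    return A.T @ (A / ybar[:, None])

# Toy 3-ray, 2-pixel system (illustrative geometry only):
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x = np.array([10.0, 5.0])   # mean pixel activities
ybar = A @ x                # mean sinogram: one entry per detector pair
F = fisher_information(A, ybar)
```

In practice ybar must be estimated from noisy data, and at low counts the 1/ybar terms blow up for individual detector pairs, which is exactly the instability the GELT estimator addresses.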
Affiliation(s)
- Quanzheng Li, Signal and Image Processing Institute, University of Southern California, Los Angeles, CA 90089, USA
11. Panin VY, Kehren F, Hamill JJ, Michel C. Application of discrete data consistency conditions for selecting regularization parameters in PET attenuation map reconstruction. Phys Med Biol 2004;49:2425-36. [PMID: 15248587; DOI: 10.1088/0031-9155/49/11/021]
Abstract
Simultaneous emission and transmission measurement is appealing in PET because the geometrical conditions of emission and transmission match and the acquisition time of the study is reduced. A potential problem remains: when transmission statistics are low, attenuation correction can be very noisy. Although noise in the attenuation map can be controlled through regularization during statistical reconstruction, the selection of regularization parameters is usually empirical. In this paper, we investigate the use of discrete data consistency conditions (DDCC) to optimally select one or two regularization parameters. The advantages of the method are that the reconstructed attenuation map is consistent with the emission data and that it accounts for the specifics of the emission reconstruction algorithm and acquisition geometry. The methodology is validated using a computer-generated whole-body phantom for both emission and transmission, neglecting random events and scattered radiation. MAP-TR was used for attenuation map reconstruction, while 3D OS-EM was used for estimating the emission image. The estimation of the regularization parameters depends on the resolution of the emission image, which is controlled by the number of iterations in OS-EM. The computer simulation shows that, on the one hand, a DDCC-regularized attenuation map reduces the propagation of transmission scan noise into the emission image, while on the other hand DDCC prevents excessive attenuation map smoothing that could result in resolution-mismatch artefacts between emission and transmission.
12. Ahn S, Fessler JA. Emission image reconstruction for randoms-precorrected PET allowing negative sinogram values. IEEE Trans Med Imaging 2004;23:591-601. [PMID: 15147012; DOI: 10.1109/tmi.2004.826046]
Abstract
Most positron emission tomography (PET) emission scans are corrected for accidental coincidence (AC) events by real-time subtraction of delayed-window coincidences, leaving only the randoms-precorrected data available for image reconstruction. The real-time randoms precorrection compensates in mean for AC events but destroys the Poisson statistics. The exact log-likelihood for randoms-precorrected data is inconvenient, so practical approximations are needed for maximum likelihood or penalized-likelihood image reconstruction. Conventional approximations involve setting negative sinogram values to zero, which can induce positive systematic biases, particularly for scans with low counts per ray. We propose new likelihood approximations that allow negative sinogram values without requiring zero-thresholding. With negative sinogram values, the log-likelihood functions can be nonconcave, complicating maximization; nevertheless, we develop monotonic algorithms for the new models by modifying the separable paraboloidal surrogates and the maximum-likelihood expectation-maximization (ML-EM) methods. These algorithms ascend to local maximizers of the objective function. Analysis and simulation results show that the new shifted Poisson (SP) model is nearly free of systematic bias yet keeps low variance. Despite its simpler implementation, the new SP performs comparably to the saddle-point model which has shown the best performance (as to systematic bias and variance) in randoms-precorrected PET emission reconstruction.
Affiliation(s)
- Sangtae Ahn, Electrical Engineering and Computer Science Department, University of Michigan, Ann Arbor, MI 48109-2122, USA
13. Yu DF, Fessler JA. Edge-preserving tomographic reconstruction with nonlocal regularization. IEEE Trans Med Imaging 2002;21:159-173. [PMID: 11929103; DOI: 10.1109/42.993134]
Abstract
Tomographic image reconstruction using statistical methods can provide more accurate system modeling, statistical models, and physical constraints than the conventional filtered backprojection (FBP) method. Because of the ill-posedness of the reconstruction problem, a roughness penalty is often imposed on the solution to control noise. To avoid smoothing of edges, which are important image attributes, various edge-preserving regularization methods have been proposed. Most of these schemes rely on information from local neighborhoods to determine the presence of edges. In this paper, we propose a cost function that incorporates nonlocal boundary information into the regularization method. We use an alternating minimization algorithm with deterministic annealing to minimize the proposed cost function, jointly estimating region boundaries and object pixel values. We apply variational techniques implemented using level-set methods to update the boundary estimates; then, using the most recent boundary estimate, we minimize a space-variant quadratic cost function to update the image estimate. For the positron emission tomography transmission reconstruction application, we compare the bias-variance tradeoff of this method with that of a conventional penalized-likelihood algorithm with a local Huber roughness penalty.
Affiliation(s)
- Daniel F Yu, Electrical Engineering and Computer Science Department, University of Michigan, Ann Arbor, MI 48109-2122, USA
14. La Rivière PJ, Pan X. Nonparametric regression sinogram smoothing using a roughness-penalized Poisson likelihood objective function. IEEE Trans Med Imaging 2000;19:773-786. [PMID: 11055801; DOI: 10.1109/42.876303]
Abstract
We develop and investigate an approach to tomographic image reconstruction in which nonparametric regression using a roughness-penalized Poisson likelihood objective function is used to smooth each projection independently prior to reconstruction by unapodized filtered backprojection (FBP). As an added generalization, the roughness penalty is expressed in terms of a monotonic transform, known as the link function, of the projections. The approach is compared to shift-invariant projection filtering through the use of a Hanning window as well as to a related nonparametric regression approach that makes use of an objective function based on weighted least squares (WLS) rather than the Poisson likelihood. The approach is found to lead to improvements in resolution-noise tradeoffs over the Hanning filter as well as over the WLS approach. We also investigate the resolution and noise effects of three different link functions: the identity, square root, and logarithm links. The choice of link function is found to influence the resolution uniformity and isotropy properties of the reconstructed images. In particular, in the case of an idealized imaging system with intrinsically uniform and isotropic resolution, the choice of a square root link function yields the desirable outcome of essentially uniform and isotropic resolution in reconstructed images, with noise performance still superior to that of the Hanning filter as well as that of the WLS approach.
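A stripped-down version of the idea — penalized Poisson-likelihood smoothing of a single projection, here with a log link and plain gradient descent rather than the authors' nonparametric-regression machinery — might look like:

```python
import numpy as np

def penalized_poisson_smooth(y, beta, iters=3000, step=0.01):
    """Roughness-penalized Poisson-likelihood smoothing of one projection.
    A log link keeps the fit positive: minimize over u
        sum(exp(u) - y*u) + beta * sum((u[i+1] - u[i])**2)
    by plain gradient descent (a sketch, not the authors' algorithm)."""
    u = np.log(np.maximum(y, 1.0))       # start at the (clamped) raw data
    for _ in range(iters):
        lam = np.exp(u)
        grad = lam - y                   # gradient of the Poisson data term
        du = np.diff(u)
        grad[:-1] -= 2.0 * beta * du     # gradient of the roughness penalty
        grad[1:] += 2.0 * beta * du
        u -= step * grad
    return np.exp(u)

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 40)
truth = 30.0 * np.exp(-4.0 * (t - 0.5) ** 2)   # smooth projection profile
y = rng.poisson(truth).astype(float)
fit = penalized_poisson_smooth(y, beta=3.0)
```

The smoothed projections would then be fed to unapodized FBP; the choice of link (identity, square root, or logarithm) is what the paper shows to control resolution uniformity and isotropy.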
MESH Headings
- Algorithms
- Anisotropy
- Artifacts
- Humans
- Image Processing, Computer-Assisted/methods
- Image Processing, Computer-Assisted/statistics & numerical data
- Likelihood Functions
- Models, Statistical
- Phantoms, Imaging/statistics & numerical data
- Poisson Distribution
- Regression Analysis
- Reproducibility of Results
- Statistics, Nonparametric
- Tomography, Emission-Computed/methods
- Tomography, Emission-Computed/statistics & numerical data
- Tomography, Emission-Computed, Single-Photon/methods
- Tomography, Emission-Computed, Single-Photon/statistics & numerical data
Affiliation(s)
- P J La Rivière, Department of Radiology, The University of Chicago, Chicago, IL 60637, USA