Objective Functions for Tomographic Reconstruction from Randoms-Precorrected PET Scans

Mehmet Yavuz and Jeffrey A. Fessler
Dept. of EECS, University of Michigan
(This work was supported in part by NIH grants CA-6711 and CA-54362.)

Abstract

In PET, usually the data are precorrected for accidental coincidence (AC) events by real-time subtraction of the delayed-window coincidences. Randoms subtraction compensates in mean for AC events but destroys the Poisson statistics. Furthermore, for transmission tomography the weighted least-squares (WLS) method leads to systematic biases, especially at low count rates. We propose a new "shifted" Poisson (SP) model for precorrected PET data, which properly matches the first and second order moments of the measurement statistics. Using simulations and analytic approximations, we show that estimators based on the "ordinary" Poisson (OP) model for the precorrected data lead to higher standard deviations than the proposed method. Moreover, if one zero-thresholds the data before applying the maximization algorithm, the OP model results in systematic bias. It is shown that the proposed SP model leads to penalized-likelihood estimates free of systematic bias, even for zero-thresholded data. The proposed SP model does not increase the computation requirements compared to the OP model, and it is robust to errors in the estimates of the AC event rates.

I. Introduction

Accidental coincidence (AC) events are a primary source of background noise in PET measurements. AC events occur when photons that arise from separate annihilations are mistakenly registered as having arisen from the same annihilation. In transmission scans, photons that originate from different transmission sources (rod or sector sources rotating around the patient) cause AC events. Due to rod masking, the ratio of these AC events to "true" events is usually small in transmission scans compared to emission scans. However, the effect of AC events is significant for regions of high attenuation coefficients, because projections through such regions result in low true coincidence rates, which can become comparable to the AC rates.

In a conventional PET scan, the data are precorrected for AC events by real-time subtraction of the delayed-window coincidences. Real-time subtraction of delayed-window coincidences [6] compensates in mean for AC events but destroys the Poisson statistics. To avoid this problem, one needs to maintain the transmission and randoms measurements as two separate sinograms [7]. However, even if a PET system enables one to collect the randoms (delayed coincidences) sinogram separately, this process doubles the storage space for the acquired data. So in practice most PET centers collect and archive only the randoms-precorrected data.

Although our analysis and proposed model apply to both emission and transmission tomography, in this paper we focus on transmission tomography. We argue that for transmission scans, the WLS method and the ML method based on the ordinary Poisson (OP) model lead to systematic bias and higher variance, respectively. Thus, we propose a "shifted" Poisson (SP) model which matches both the first and second-order moments of the model to the underlying statistics of the precorrected data. The corresponding log-likelihood function is shown to have better agreement with the exact log-likelihood function than the WLS and OP objective functions.
We performed 2D simulations which showed that the proposed SP model yields lower variance in the reconstructed images than the OP model. Another observation was that the WLS method leads to unacceptably high systematic bias, especially at low count rates. Lastly, we investigated the effect of using estimates of the AC rates. The SP estimator is found to be robust to errors in the estimates of the AC rates. The SP model has computation requirements for the maximization algorithm similar to those of the OP model.

II. Measurement Model

In a conventional PET scan, the data are precorrected for AC events by real-time subtraction of the delayed-window coincidences [6]. The system detects coincidence events during two time windows: the "prompt" window and the "delayed" window. For each coincidence event in the prompt window, the corresponding sinogram bin is incremented. The statistics of these increments should be well approximated by a Poisson process. However, for coincidence events within the second, delayed window, the corresponding sinogram bin is decremented, so the resultant "precorrected" events are not Poisson. Since prompt events and delayed events are independent Poisson processes, the precorrected measurements correspond to the difference of two independent Poisson random variables, with variance equal to the sum of the means of the two random variables. In other words, randoms subtraction compensates in mean for AC events, but it also increases the variance of the measurement by an amount equal to the mean of the AC events.

Let $y = [y_1, \ldots, y_N]$ denote the vector of precorrected transmission scan measurements. The precorrected measurement for the $n$th coincidence detector pair is

  $y_n = y_{n,p} - y_{n,d}$,   (1)

where $y_{n,p}$ and $y_{n,d}$ are the numbers of coincidences within the prompt and delayed windows, respectively.
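As a quick numerical check of the difference-of-Poissons statement above, the following minimal sketch (the rates are arbitrary illustrative values, not taken from the paper) shows that the precorrected counts keep the mean of the true coincidences but acquire twice the AC variance:

import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative rates (not the paper's values): the prompt window
# records true coincidences plus AC events; the delayed window records AC
# events alone.
true_mean, ac_mean = 70.0, 10.0
prompts = rng.poisson(true_mean + ac_mean, size=500_000)  # y_{n,p}
delayed = rng.poisson(ac_mean, size=500_000)              # y_{n,d}
y = prompts - delayed                                     # precorrected data, eq. (1)

print(y.mean())  # approx. 70 = true_mean: the mean is compensated
print(y.var())   # approx. 90 = true_mean + 2*ac_mean: the variance is inflated

The empirical variance exceeds the Poisson value (the mean) by twice the AC rate, which is precisely the moment mismatch that the shifted Poisson model described below is designed to absorb.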

Let $\mu = [\mu_1, \ldots, \mu_p]$ denote the vector of unknown linear attenuation coefficients. We assume that $y_{n,p}$ and $y_{n,d}$ are statistically independent realizations of Poisson random variables with means $\bar{y}_{n,p}$ and $\bar{y}_{n,d}$, respectively:

  $\bar{y}_{n,p}(\mu) = b_n e^{-l_n(\mu)} + r_n$   (2)
  $\bar{y}_{n,d} = r_n$   (3)

where $l_n(\mu) = \sum_{j=1}^{p} a_{nj} \mu_j$ is the total attenuation between the $n$th detector pair. The $a_{nj}$ factors have units of length and describe the tomographic system geometry. The $b_n > 0$ factors denote the blank scan counts, and the $r_n$ factors denote the means of the AC events. Since $y_{n,p}$ and $y_{n,d}$ are statistically independent,

  $E\{y_n\} = \bar{y}_n(\mu) = \bar{y}_{n,p}(\mu) - \bar{y}_{n,d} = b_n e^{-l_n(\mu)}$,
  $\mathrm{Var}\{y_n\} = \bar{y}_{n,p}(\mu) + \bar{y}_{n,d} = b_n e^{-l_n(\mu)} + 2 r_n$.

Thus the precorrected measurements $y_n$ are clearly not Poisson distributed. One simple approach to image reconstruction would be to assume that the measurements have a Poisson distribution with means $\bar{y}_n(\mu)$; even though this model is incorrect, we refer to this approach as the "ordinary Poisson" model.

To illustrate the inaccuracy of the ordinary Poisson measurement model for the $y_n$'s, we performed a small Monte Carlo simulation similar to [2]. The circles in Fig. 1 show a simulated histogram for $y_n$ generated by a pseudo-random number generator in accordance with the distribution described above (for 5,000 realizations), with $\bar{y}_{n,p} = 8$ and $\bar{y}_{n,d} = r_n = 1$ (corresponding to 12.5 percent randoms).

Fig. 1. Comparison of (a) Gaussian, (b) ordinary Poisson and (c) shifted Poisson models (with the moments matched to the moments of the precorrected measurements) with the empirical distribution of the precorrected measurements.

Fig. 1a shows the approximation based on a Gaussian model with mean $\bar{y}_n$ and variance $\bar{y}_n + 2 r_n$. Fig. 1b shows the ordinary Poisson (OP) model, where the approximation is based on a Poisson model with the ideal mean $\bar{y}_n$. Lastly, Fig. 1c shows the approximation based on a Poisson model with mean $\bar{y}_n + 2 r_n$ that is then shifted by $2 r_n$. The resultant approximation matches both the first and second order moments of $y_n$. This last approximation corresponds to our proposed "shifted" Poisson (SP) model, and it has the best agreement with the distribution of the precorrected measurement $y_n$. For large means, the shifted Poisson distribution is also approximately Gaussian by the Central Limit Theorem. However, in transmission tomography the projections through high attenuation regions (which are usually the regions of interest) have lower count rates, and the above example illustrates that for low count rates the Gaussian approximation is less accurate than the SP model.

III. Objective Functions

Let $y_n$ given in (1) be a realization of statistically independent random variables $\{y_n\}_{n=1}^{N}$. The exact log-likelihood for $\mu$ can be formulated using total probability:

  $L(\mu) = \log P(Y = y; \mu) = \sum_{n=1}^{N} \left[ \log \left( \sum_{m=[-y_n]_+}^{\infty} \frac{\bar{y}_{n,p}(\mu)^{(y_n+m)}}{(y_n+m)!} \, \frac{\bar{y}_{n,d}^{\,m}}{m!} \right) - \left( \bar{y}_{n,p}(\mu) + \bar{y}_{n,d} \right) \right]$.   (4)

This exact log-likelihood function contains infinite summations, so in practice one cannot compute its exact value. In the light of the Monte Carlo simulation performed in the previous section, one can develop different approximations to the exact log-likelihood function. We describe three such approximations below.
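Before turning to those approximations, note that although (4) cannot be evaluated in closed form, the inner series converges quickly, so a truncated sum is sufficient to produce a reference curve such as the exact log-likelihood shown in Fig. 2. A minimal sketch of one ray's contribution (the function name and the 200-term cutoff are our own illustrative choices; $r_n > 0$ is assumed):

import math

def exact_ray_loglik(y, p, d, terms=200):
    """One ray's contribution to the exact log-likelihood (4).

    y: precorrected count y_n (may be negative)
    p: prompt mean  \bar{y}_{n,p}(mu) = b_n exp(-l_n(mu)) + r_n
    d: delayed mean \bar{y}_{n,d} = r_n  (assumed > 0)
    """
    start = max(0, -y)                    # series starts at m = [-y_n]_+
    logs = [(y + m) * math.log(p) - math.lgamma(y + m + 1)
            + m * math.log(d) - math.lgamma(m + 1)
            for m in range(start, start + terms)]
    top = max(logs)                       # log-sum-exp for numerical stability
    return top + math.log(sum(math.exp(v - top) for v in logs)) - (p + d)

Summing this term over all rays reproduces (4) up to truncation error.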
The quadratic approximation to the exact log-likelihood function results in the weighted least squares (WLS) objective function [8]:

  $L_{WLS}(\mu) = -\frac{1}{2} \sum_{n:\, y_n > 0} \frac{(l_n(\mu) - \hat{l}_n)^2}{\hat{\sigma}_n^2}$,   (5)

where $\hat{l}_n = \log(b_n / y_n)$ is the method-of-moments estimate of the line integral of the attenuation $l_n(\mu)$, and the $n$th weighting factor $\hat{\sigma}_n^2 = (y_n + 2 r_n)/y_n^2$ is an estimate of the variance of $\hat{l}_n(y_n)$ based on a second-order Taylor expansion. This weighting is critical for the WLS method: the errors corresponding to projections with large values of $y_n$ are weighted more heavily. These projections pass through less dense objects and consequently have higher SNR values.

The ordinary Poisson model for the precorrected data $y_n$, with means $\bar{y}_n(\mu) = b_n e^{-l_n(\mu)}$, leads to the OP objective function

  $L_{OP}(\mu) = \sum_{n=1}^{N} y_n \log\!\left(b_n e^{-l_n(\mu)}\right) - b_n e^{-l_n(\mu)}$,   (6)

disregarding constants independent of $\mu$.
In the light of Fig. 1c, a better approach, which matches both the first and the second order moments [see footnote 2], is to approximate the quantities $(y_n + 2 r_n)$ as realizations of independent Poisson random variables with means $(\bar{y}_n(\mu) + 2 r_n)$. This model leads to our proposed SP objective function

  $L_{SP}(\mu) = \sum_{n=1}^{N} h_n(l_n(\mu))$,   (7)

where

  $h_n(l) = (y_n + 2 r_n) \log\!\left(b_n e^{-l} + 2 r_n\right) - \left(b_n e^{-l} + 2 r_n\right)$.

(It can be shown that $L_{WLS}(\mu)$ corresponds to the sum of second-order Taylor series expansions of $h_n(l_n(\mu))$ about $\hat{l}_n = \log(b_n / y_n)$.)

Fig. 2 compares the exact log-likelihood function and the approximations for noiseless data as a function of a single projection across the reconstructed image, for a 5% randoms rate. $L_{SP}(\mu)$ agrees fairly well with the exact log-likelihood $L(\mu)$, whereas $L_{WLS}(\mu)$ and $L_{OP}(\mu)$ depart significantly from the exact log-likelihood function. Note that all the curves have their maximum at the same point $\hat{l}_n = \log(b_n / y_n)$. This is because all estimators work perfectly with noiseless data (i.e., $y_n = \bar{y}_n$). For noisy data the maximum of each curve will exhibit a variation around its mean value. We also observed that, for noisy data, $L_{SP}(\mu)$ agrees with the exact log-likelihood $L(\mu)$ better than the other models.

Fig. 2. Comparison of the exact log-likelihood function with the objective functions of the different models (quadratic/WLS, ordinary Poisson, and shifted Poisson) for a single projection ray. The proposed shifted Poisson model agrees with the exact log-likelihood better than the other models.

Footnote 2: Key difference: both $L_{WLS}$ and $L_{SP}$ match two moments, but in WLS the second moment is "fixed" (estimated from the data), and it is the moments of $\hat{l}_n$ rather than of $y_n$ that are matched.

IV. Bias-Variance Analysis

To analyze the bias and variance of each estimator analytically, we used the analytic approximations in [4]. For this purpose we considered a highly simplified version of transmission tomography in which the unknown is a scalar parameter. This simplified problem provides insight into the estimator bias and variance without the undue notation of the multi-parameter case. Because of space considerations we are not able to give detailed formulas for the mean and variance approximations of the different estimators. Applying the Cauchy-Schwarz inequality to the variance approximations, we have shown analytically that the SP estimator yields a lower variance than the OP estimator [9]. This result agrees with the 2D simulations shown next.

A. 2D Simulations

To study the bias and variance properties of each estimator described above, we performed 2D simulations. For $\mu$ we used the synthetic attenuation map shown in Fig. 3, which represents a human abdomen with linear attenuation coefficient 0.0096/mm. The image was a 128 by 64 array of 4.5 mm pixels. We simulated a PET transmission scan with 192 radial bins and 256 angles uniformly spaced over 180 degrees. The $a_{nj}$ factors correspond to 6 mm wide strip integrals on 3 mm center-to-center spacing. The $b_n$ factors were generated using pseudo-random log-normal variates with a standard deviation of 0.3 to account for detector efficiency variations, and scaled so that $\sum_n \bar{y}_n$ was one million counts. The $r_n$ factors corresponded to a uniform field of 5% random coincidences. Pseudo-random transmission measurements were generated according to (2) and (3).
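The reconstructions described next maximize penalized versions of the objectives (5)-(7). For reference, the per-ray terms of the three objectives can be written compactly; a minimal sketch in the same notation (the function names are ours, and rays with $y_n \le 0$ must be skipped in the WLS sum, as in (5)):

import numpy as np

def wls_term(l, y, b, r):
    """Per-ray WLS term from (5): -(1/2)(l - l_hat)^2 / sigma_hat^2,
    with l_hat = log(b/y) and sigma_hat^2 = (y + 2r)/y^2.  Requires y > 0."""
    l_hat = np.log(b / y)
    sigma2 = (y + 2 * r) / y ** 2
    return -0.5 * (l - l_hat) ** 2 / sigma2

def op_term(l, y, b):
    """Per-ray ordinary Poisson log-likelihood term from (6)."""
    ybar = b * np.exp(-l)
    return y * np.log(ybar) - ybar

def sp_term(l, y, b, r):
    """Per-ray shifted Poisson term h_n(l) from (7)."""
    ybar = b * np.exp(-l) + 2 * r
    return (y + 2 * r) * np.log(ybar) - ybar

Summing each term over the rays and subtracting a roughness penalty yields the objective that the grouped-coordinate ascent algorithm maximizes in the experiments below.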
For regularization, we used the modified quadratic penalty of [3], which improves the spatial resolution uniformity.

Fig. 3. Simulated abdomen attenuation map.

We generated 100 independent realizations of the transmission measurements. For each measurement realization, an estimate of the attenuation map was reconstructed using 20 iterations of the grouped-coordinate ascent algorithm [5] applied to the objective functions (5), (6) and (7). We computed both the sample mean and sample standard deviation images for all three methods. Fig. 4 shows horizontal profiles through the sample mean images. These profiles show that WLS is systematically negatively biased, whereas the OP and SP models are free of systematic bias. (One should note that the overshoot at the edges is due to the quadratic penalty used in the reconstruction; even with noiseless data, this blurring effect would still be present.) To study the standard deviation, we calculated the ratio of the sample standard deviation images of the different estimators.

Fig. 5 shows the histogram of the ratio of standard deviations over all interior pixels. The OP model yields about 20% higher standard deviation than the SP model. In other words, to achieve the same noise level, the OP method would require about 40% greater scan time than our proposed SP method (variance scales inversely with scan time, and 1.2 squared is approximately 1.44).

Fig. 4. The WLS method has a systematic negative bias; the ordinary Poisson and the shifted Poisson models yield negligible bias (profiles through the sample mean images, abdomen phantom).

Fig. 5. The ordinary Poisson model yields, on average, about 20% higher standard deviation than the proposed shifted Poisson model (abdomen phantom).

We performed additional simulations for the same PET system described above, but this time using the synthetic attenuation map shown in Fig. 6, which represents a human thorax with linear attenuation coefficients 0.0165/mm, 0.0096/mm, and 0.0025/mm for bone, soft tissue, and lungs, respectively. Fig. 7 shows the profiles through the sample mean images. Similar to the observations from Fig. 4, the WLS estimator is negatively biased. Fig. 8 shows the histogram of the ratio of standard deviations; again the OP model yields, on average, about 11% higher standard deviation than the SP model.

Fig. 6. Simulated thorax attenuation map.

Fig. 7. The WLS method has a systematic negative bias; the ordinary Poisson and the shifted Poisson models yield negligible bias (profiles through the sample mean images, thorax phantom).

Fig. 8. The ordinary Poisson model yields, on average, about 12% higher standard deviation than the shifted Poisson model (thorax phantom).

Note that the process of real-time subtraction of the delayed coincidence events from the prompt events can lead to some negative values in the precorrected data. Since the mean of the precorrected measurements is nonnegative, a natural choice might be to threshold the negative values in the precorrected data to zero before applying the maximization algorithm. (Moreover, the likelihood objective function for emission tomography is not guaranteed to be concave if the measurements have negative values.) To study further the effects of zero-thresholding the data, we performed additional 2D simulations with zero-thresholded data and the above phantoms [9]. The results showed that the OP estimator was systematically negatively biased, especially in interior regions of the reconstructed image, and still had a higher standard deviation than the SP estimator.

B. Estimates of the AC rates ($\hat{r}_n$)

One needs to know the means of the AC events ($r_n$) in order to compute $L_{SP}(\mu)$. Since the $r_n$ terms are not readily available from the real (precorrected) data, some estimates of the randoms must be used. Fig. 9 displays a scatter plot of real delayed-coincidence sinograms for blank scan and transmission scan data. Each point in the plot corresponds to a specific detector pair. The similarity of the two delayed-coincidence measurements suggests that one can simultaneously acquire the delayed-coincidence events during the blank scan and use them (after properly normalizing for the different scan durations) as an estimate of the AC rates for different transmission scans performed on the same PET system.

Fig. 9. Scatter plot of the delayed-coincidence event rates of blank and transmission scans.

For emission tomography, the "singles" method suggested by Casey [1] can be used to obtain an estimate of the AC event rates.

To test the robustness of the SP estimator to errors in the estimates of the AC rates, we performed additional simulations using the phantoms and the PET system described previously. We observed that even using a single constant value $\bar{r} = (1/N) \sum_n r_n$ as an estimate of the AC event rates did not introduce any systematic bias for the SP estimator and increased the standard deviation only slightly (around 2%). Since the AC rates of the transmission and blank scans are highly correlated, using AC rates obtained from blank scan measurements should yield better AC rate estimates, resulting in bias/variance performance very similar to that of estimators using the true AC rates.

V. Conclusions

When the AC events are precorrected in PET, the measurement statistics are no longer Poisson. For transmission scans, the WLS method and the ML method based on the ordinary Poisson (OP) model lead to systematic bias and higher variance, respectively. Thus, we proposed a shifted Poisson (SP) model for the measurement statistics which matches both the first and second-order moments. The corresponding log-likelihood function was shown to have better agreement with the exact log-likelihood function than the other methods above. Using a Taylor approximation, the implicit function theorem and the chain rule, one can obtain analytic expressions for the mean and the variance of the different estimators, and using the Cauchy-Schwarz inequality it can be shown analytically that the OP model leads to higher variance than both the WLS and the SP methods [9]. We performed 2D simulations to support this observation. Another observation was that the WLS method leads to unacceptably high systematic bias, especially for low count rates. We have also shown that zero-thresholding the precorrected data leads to systematic negative bias for the OP model estimator; in addition, the OP estimator still had a higher variance than the SP estimator [9]. Lastly, the effect of using estimates of the AC rates for the SP estimator was investigated. We observed that the SP model is robust to errors in the estimates of the AC events: even using constant AC rates resulted in only a slight increase in the standard deviation, without any systematic bias [9]. It should be noted that the SP model does not increase the computation requirements for the maximization algorithm over those of the OP model.

References

[1] M. E. Casey and E. J. Hoffman. Quantitation in positron emission computed tomography: 7. A technique to reduce noise in accidental coincidence measurements and coincidence efficiency calibration. J. Comp. Assisted Tomo., 10(5):845-850, 1986.
[2] J. A. Fessler. Penalized weighted least-squares image reconstruction for positron emission tomography. IEEE Tr. Med. Im., 13(2):290-300, June 1994.
[3] J. A. Fessler. Resolution properties of regularized image reconstruction methods. Technical Report 297, Comm. and Sign. Proc. Lab., Dept. of EECS, Univ. of Michigan, Ann Arbor, MI, 48109-2122, August 1995.
[4] J. A. Fessler. Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): Applications to tomography. IEEE Tr. Im. Proc., 5(3):493-506, March 1996.
[5] J. A. Fessler, E. P. Ficaro, N. H. Clinthorne, and K. Lange. Grouped-coordinate ascent algorithms for penalized-likelihood transmission image reconstruction. IEEE Tr. Med. Im., 1996. To appear.
[6] E. J. Hoffman, S. C. Huang, M. E. Phelps, and D. E. Kuhl. Quantitation in positron emission computed tomography: 4. Effect of accidental coincidences. J. Comp. Assisted Tomo., 5(3):391-400, 1981.
[7] E. U. Mumcuoglu, R. M. Leahy, and S. R. Cherry. Bayesian reconstruction of PET images: methodology and performance analysis. Phys. Med. Biol., 1996.
[8] K. Sauer and C. Bouman. A local update strategy for iterative reconstruction from projections. IEEE Tr. Sig. Proc., 41(2):534-548, February 1993.
[9] M. Yavuz and J. A. Fessler. Objective functions and reconstruction algorithms for randoms-precorrected PET scans, 1996. In preparation.