Multiresolution Approach to Acceleration of Iterative Image Reconstruction for X-Ray Imaging for Security Applications
Abstract
Three-dimensional x-ray CT image reconstruction in baggage scanning for security applications is an important research field. The variety of materials to be reconstructed is broader than in medical x-ray imaging. The presence of highly attenuating materials such as metal may cause artifacts if analytical reconstruction methods are used. Statistical modeling and the resultant iterative algorithms are known to reduce these artifacts and deliver good quantitative accuracy in estimates of linear attenuation coefficients. However, iterative algorithms may require substantial computation in order to achieve quantitatively accurate results. For the case of baggage scanning, in order to provide fast and accurate inspection throughput, they must be accelerated drastically. Many approaches have been proposed in the literature to increase the speed of convergence. This paper presents a new method that estimates the wavelet coefficients of the image in the discrete wavelet transform domain instead of estimating the image itself. Initially, surrogate functions are created around the approximation coefficients only. As the iterations proceed, the wavelet tree on which the updates are made is expanded based on a criterion, and the detail coefficients at each level are updated in turn. For example, in smooth regions of the image the detail coefficients are not updated, while the coefficients that represent the high-frequency content around edges are, thus saving time by focusing computation where it is needed. This approach is implemented on real data from a SureScan^{TM} x1000 Explosive Detection System (SureScan^{TM} is a trademark of the SureScan Corporation) and compared to a straightforward implementation of the unregularized alternating minimization algorithm of O'Sullivan and Benac [1].
I Summary
X-ray CT image reconstruction algorithms are often designed to trade off a data-fit term and an image roughness term. Typical examples of data-fit terms are squared error [3] and log-likelihood. Typical examples of image roughness terms are total variation, Gauss-Markov random field priors [4], and Huber-class roughness penalties. A complementary approach to penalizing roughness is to represent the image using a wavelet (or other multiresolution) expansion, directly estimate the wavelet coefficients, and introduce a penalty (often an $\ell_1$ norm) on the wavelet coefficients [5, 6]. Multigrid approaches [7, 8, 9, 10] have been shown to achieve faster convergence in some image reconstruction methods. The idea is to move through different grid levels over time; at any grid level, the voxel size is the same throughout the image domain. The approach we describe in this paper is closest to this, with the following differences:

- Our data-fit term is a Poisson log-likelihood with mean determined by Beer's law.
- We use an alternating minimization framework to derive an algorithm that is guaranteed to decrease the cost function at every iteration.
- We extend the prior alternating minimization framework to point spread functions with negative values.
- We update only a subset of wavelet coefficients, constrained to lie on a tree, thereby decreasing the computational complexity per iteration relative to fully updating the image.
- We adaptively update the tree defining which wavelet coefficients are updated at each iteration.
- Our wavelet tree structure results in an image-domain representation with voxels of different sizes.
- We incorporate an adaptive threshold allowing the computation of a sequence of images of increasing resolution, with increasing roughness and increasing log-likelihood.
The result is a fast, adaptive, iterative image reconstruction algorithm.
II Introduction
The problem to be solved for x-ray CT in this paper is penalized likelihood estimation with a Poisson log-likelihood data-fitting term and a regularization term, optimized over the image $\mu$. It is shown in [1] that maximization of the Poisson log-likelihood term is equivalent to minimizing the I-divergence $I(d \,\|\, q(\mu))$ between the transmission data $d$ and the estimated mean $q(\mu)$, where the I-divergence between two nonnegative vectors $a$ and $b$ is defined as $I(a \,\|\, b) = \sum_i \left( a_i \log \frac{a_i}{b_i} - a_i + b_i \right)$, $q_i(\mu) = I_i \exp\left( -\sum_j h_{ij} \mu_j \right)$, $I$ is the incident photon count vector, and $h_{ij}$ is the element of the system matrix that represents the length of the intersection between the ray path of index $i$ and the voxel of index $j$. This penalized likelihood estimation problem can then be formulated as [11]
(1)  $\mu^* = \arg\min_{\mu \ge 0} \left[ I(d \,\|\, q(\mu)) + \lambda R(\mu) \right]$
where $R(\cdot)$ is a regularization term selected as a roughness penalty and $\lambda$ is the parameter that controls the level of roughness imposed on the image. It is also important to note that the nonnegativity constraint on $\mu$ is due to the nature of linear attenuation coefficients of materials. Since there is no closed-form solution to this problem, we solve it iteratively. At each iteration, a surrogate function that approximates the original objective function is minimized, which in turn decreases the original objective function. In our recent work [11], we generalized the formulation of surrogate functions in [1] from the data-fitting term to the regularization term. The idea is to use Jensen's inequality to decouple the objective function into many one-parameter convex functions, minimize them, and iterate.
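As a concrete illustration of the data model, the sketch below builds a toy Beer's-law forward mean and evaluates the I-divergence between data and estimated mean. The tiny system matrix, count levels, and attenuation values are invented for the example.

```python
import numpy as np

def forward_mean(H, mu, I0):
    """Expected photon counts under Beer's law: q_i = I0_i * exp(-sum_j h_ij mu_j)."""
    return I0 * np.exp(-H @ mu)

def i_divergence(a, b):
    """I(a||b) = sum_i a_i*log(a_i/b_i) - a_i + b_i (terms with a_i == 0 reduce to b_i)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pos = a > 0
    return np.sum(a[pos] * np.log(a[pos] / b[pos])) - a[pos].sum() + b.sum()

# Toy 2-ray, 3-voxel system: h_ij = intersection length of ray i with voxel j.
H = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.5, 1.5]])
I0 = np.array([1e4, 1e4])             # incident photon counts
mu_true = np.array([0.2, 0.1, 0.3])   # linear attenuation coefficients

d = forward_mean(H, mu_true, I0)      # noiseless "data"
```

The I-divergence is nonnegative and vanishes exactly when the estimated mean matches the data, which is why it serves as a data-fit term.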
Assume that there exists an inverse discrete wavelet transform matrix $W$ that is nonsingular. Then the image can be represented as
(2)  $\mu = W c$
where $c$ is the vector of wavelet coefficients. The problem in this paper can then be written as
(3)  $c^* = \arg\min_{c \,:\, Wc \ge 0} \left[ I(d \,\|\, q(Wc)) + \lambda R(Wc) \right]$
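The only requirement on $W$ above is nonsingularity. A minimal illustration using a hand-rolled single-level orthonormal 1-D Haar matrix (an assumption for the example; the paper itself uses a level-3 2-D Haar transform):

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal single-level 1-D Haar analysis matrix for even n."""
    W = np.zeros((n, n))
    s = 1.0 / np.sqrt(2.0)
    for k in range(n // 2):
        W[k, 2*k] = W[k, 2*k + 1] = s                  # approximation (low-pass) rows
        W[n//2 + k, 2*k], W[n//2 + k, 2*k + 1] = s, -s  # detail (high-pass) rows
    return W

n = 8
A = haar_matrix(n)   # analysis: c = A @ mu
W = A.T              # synthesis (inverse DWT): mu = W @ c; A is orthonormal, so W = A^T

mu = np.random.default_rng(0).random(n)
c = A @ mu           # wavelet coefficients of the image
```

Because $W$ is nonsingular (here orthonormal), estimating $c$ is an exact reparameterization of the image: no information is lost by working in the wavelet domain.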
Below, the derivation of the surrogate functions for the data-fitting term is shown. A similar approach yields surrogate functions for the regularization term as well.
The I-divergence term can be written as
(4)  $I(d \,\|\, q(Wc)) = \sum_i \left[ d_i \log \frac{d_i}{q_i(Wc)} - d_i + q_i(Wc) \right], \qquad q_i(Wc) = I_i \exp\Big( -\sum_k \tilde{h}_{ik} c_k \Big)$
For simplicity, define the matrix $\tilde{H} = HW$, where $\tilde{h}_{ik} = \sum_j h_{ij} w_{jk}$ is the effective system matrix element between the ray path of index $i$ and the wavelet coefficient of index $k$. Assume that there exists a known estimate $\hat{c}$ and let $\hat{q}_i = q_i(W\hat{c})$. The terms in the I-divergence that depend on $c$ are used to construct surrogate functions as follows.
(5)  $\displaystyle \sum_i \Big[ d_i \sum_k \tilde{h}_{ik} c_k + I_i \exp\Big( -\sum_k \tilde{h}_{ik} c_k \Big) \Big] \;\le\; \sum_i \Big[ d_i \sum_k \tilde{h}_{ik} c_k + \hat{q}_i \sum_{k \in S} r_{ik} \exp\Big( -\frac{\tilde{h}_{ik}}{r_{ik}} (c_k - \hat{c}_k) \Big) + \hat{q}_i \Big( 1 - \sum_{k \in S} r_{ik} \Big) \Big]$
where
(6)  $\hat{q}_i = I_i \exp\Big( -\sum_k \tilde{h}_{ik} \hat{c}_k \Big)$
and the convex decomposition lemma [1] is used with $r_{ik} \ge 0$, $\sum_{k \in S} r_{ik} \le 1$. The weights $r_{ik}$ can be chosen as
(7)  $r_{ik} = \frac{|\tilde{h}_{ik}|}{Z}$
with $Z = \max_i \sum_{k \in S} |\tilde{h}_{ik}|$, taking $r_{ik} = 0$ when $\tilde{h}_{ik} = 0$. Here $S$ represents a subset of the wavelet domain to be chosen for update. In our approach, we choose it in a way that every voxel in the image domain is represented at any iteration, possibly with different numbers of coefficients. This subset can be fixed or varied over iterations.
With this choice, the exponent simplifies, since $\tilde{h}_{ik}/r_{ik} = Z \,\mathrm{sign}(\tilde{h}_{ik})$:

(8)  $\exp\Big( -\frac{\tilde{h}_{ik}}{r_{ik}} (c_k - \hat{c}_k) \Big) = \exp\big( -Z \,\mathrm{sign}(\tilde{h}_{ik}) \, (c_k - \hat{c}_k) \big)$
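The convex decomposition (Jensen) bound underlying the surrogate can be checked numerically. The sketch below verifies that the exponential term is majorized for a random row $\tilde{h}$ containing negative entries, with the slack weight $1 - \sum_k r_k$ placed on the zero exponent; the values are arbitrary test data.

```python
import numpy as np

rng = np.random.default_rng(1)
h = rng.normal(size=5)            # wavelet-domain system row; entries may be negative
t = rng.normal(size=5) * 0.3      # t_k = c_k - c_hat_k
Z = 1.1 * np.abs(h).sum()         # any Z >= sum_k |h_k| works
r = np.abs(h) / Z                 # r_k >= 0, sum_k r_k <= 1 (no exact zeros in h here)

lhs = np.exp(-h @ t)
# Jensen bound for the convex exponential, with slack weight (1 - sum r) on exp(0):
rhs = r @ np.exp(-(h / r) * t) + (1.0 - r.sum())
```

At $t = 0$ both sides equal one, which is the "equality at the current estimate" property that makes the surrogate minimization decrease the original objective.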
Adding back the constant terms in the I-divergence, we define our surrogate function,
(9)  $\displaystyle \phi(c; \hat{c}) = \sum_i \Big[ d_i \log \frac{d_i}{I_i} - d_i + d_i \sum_k \tilde{h}_{ik} c_k + \hat{q}_i \sum_{k \in S} r_{ik} \, e^{-Z \,\mathrm{sign}(\tilde{h}_{ik}) (c_k - \hat{c}_k)} + \hat{q}_i \Big( 1 - \sum_{k \in S} r_{ik} \Big) \Big]$
It is clear that this is a one-parameter convex function of each $c_k$, and the gradient with respect to $c_k$ is given by:
(10)  $\displaystyle \frac{\partial \phi(c; \hat{c})}{\partial c_k} = \sum_i d_i \tilde{h}_{ik} - \hat{b}_k^{+} \, e^{-Z (c_k - \hat{c}_k)} + \hat{b}_k^{-} \, e^{Z (c_k - \hat{c}_k)}$
where

(11)  $\hat{b}_k^{+} = \sum_{i :\, \tilde{h}_{ik} > 0} \tilde{h}_{ik} \, \hat{q}_i$

(12)  $\hat{b}_k^{-} = \sum_{i :\, \tilde{h}_{ik} < 0} |\tilde{h}_{ik}| \, \hat{q}_i$
The first-order necessary condition for a minimizer is that the gradient be zero; substituting $x = e^{-Z (c_k - \hat{c}_k)}$ turns (10) into a quadratic in $x$, which has a closed-form solution. The algorithm is shown below.
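To make the coordinate update concrete, here is a sketch of one unregularized wavelet-AM sweep using the back-projections above. `wavelet_am_update` and the toy system are our own illustrative constructions following the derivation, not the authors' implementation.

```python
import numpy as np

def wavelet_am_update(Ht, d, I0, c_hat, S):
    """One sweep of the separable-surrogate update over the coefficient subset S.
    Ht is the wavelet-domain system matrix H @ W (rows = rays, cols = coefficients)."""
    q_hat = I0 * np.exp(-Ht @ c_hat)            # current estimated mean
    Z = np.abs(Ht[:, S]).sum(axis=1).max()      # Z = max_i sum_{k in S} |h~_ik|
    c = c_hat.copy()
    for k in S:
        h = Ht[:, k]
        b_d = d @ h                             # data back-projection
        b_pos = h[h > 0] @ q_hat[h > 0]         # b^+_k from rays with h~_ik > 0
        b_neg = -(h[h < 0] @ q_hat[h < 0])      # b^-_k from rays with h~_ik < 0
        if b_pos <= 0:
            continue
        # Zero gradient reduces to b_pos*x^2 - b_d*x - b_neg = 0 with
        # x = exp(-Z*(c_k - c_hat_k)); take the positive root.
        x = (b_d + np.sqrt(b_d**2 + 4.0 * b_pos * b_neg)) / (2.0 * b_pos)
        if x > 0:
            c[k] = c_hat[k] - np.log(x) / Z
    return c

# Toy check with W = I (so Ht = H and all entries are nonnegative).
H = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.7]])
I0 = np.full(3, 1e4)
d = I0 * np.exp(-H @ np.array([0.3, 0.1]))      # noiseless data
c0 = np.zeros(2)
c1 = wavelet_am_update(H, d, I0, c0, S=[0, 1])
```

Because all coefficients in $S$ are updated from the same $\hat{c}$ and $\hat{q}$, this is exactly the simultaneous minimization of the separable surrogate, so each sweep cannot increase the I-divergence.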
III Results
The multiresolution technique has been evaluated using a real data scan of the NIST Phantom Test Article A [12] acquired on a SureScan^{TM} x1000 Explosive Detection System. A two-dimensional level-3 Haar discrete wavelet transform is used to represent each z-slice of the three-dimensional image volume. The wavelet tree is initialized to consist of approximation coefficients only. At a preset iteration number, the coefficients are backprojected to the image space, voxel values across z-slices are summed, and the pixels whose values are larger than a fixed fraction of the maximum of the summed image are chosen for a one-level expansion. At a later preset iteration, the same procedure with the same factor is applied to expand one level further, and the last expansion is done the same way at a third preset iteration. Figure 1 shows objective function value versus time for the unregularized alternating minimization algorithm (AM) [1] and the unregularized wavelet AM presented in this paper. The AM algorithm was run for 100 iterations while wavelet AM was run for 300 iterations. Figures 2 and 3 show image slices reconstructed by the two algorithms at the same objective function value. The difference between these two images (unregularized AM image subtracted from wavelet AM image) is shown in Figure 4. It is important to note that, even though the two images are at the same objective function value, the image reconstructed using wavelet AM has sharper edges.
IV Conclusion
A fast, iterative, adaptive algorithm for x-ray image reconstruction was formulated and presented using the alternating minimization framework. The algorithm is guaranteed to decrease the objective function at each iteration, and the adaptive wavelet tree structure provides better utilization of computation: more computation is devoted to regions with high-frequency content, such as edges, and less to smoother areas. The wavelet tree expansion used to reconstruct the image shown in the results section is one of many possible expansion strategies; other ways to expand the tree will be investigated in the future, as will different scale levels of the discrete wavelet transform, different wavelet types, and the use of regularization. Furthermore, this method can be combined with other acceleration methods such as ordered subsets [13]. Preliminary studies combining ordered subsets with wavelet AM showed promising results and will be investigated further.
Acknowledgment
We thank Carl Bosch, Nawfel Tricha, Sean Corrigan, Michael Hollenbeck, and Assaf Mesika of SureScan^{TM} Corporation for their contributions to the collection, formatting, and sharing of the data.
References
 [1] J. A. O’Sullivan and J. Benac, “Alternating minimization algorithms for transmission tomography,” Medical Imaging, IEEE Transactions on, vol. 26, no. 3, pp. 283–297, March 2007.
 [2] K. Lange, “Convergence of EM image reconstruction algorithms with Gibbs smoothing,” Medical Imaging, IEEE Transactions on, vol. 9, no. 4, pp. 439–446, December 1990.
 [3] J.-B. Thibault, K. D. Sauer, C. A. Bouman, and J. Hsieh, “A three-dimensional statistical approach to improved image quality for multislice helical CT,” Medical Physics, vol. 34, no. 11, pp. 4526–4544, 2007.
 [4] C. Bouman and K. Sauer, “A generalized Gaussian image model for edge-preserving MAP estimation,” Image Processing, IEEE Transactions on, vol. 2, no. 3, pp. 296–310, 1993.
 [5] S. Ramani and J. A. Fessler, “A splitting-based iterative algorithm for accelerated statistical X-ray CT reconstruction,” Medical Imaging, IEEE Transactions on, vol. 31, no. 3, pp. 677–688, 2012.
 [6] Q. Xu, A. Sawatzky, M. A. Anastasio, and C. O. Schirra, “Sparsity-regularized image reconstruction of decomposed K-edge data in spectral CT,” Physics in Medicine and Biology, vol. 59, no. 10, p. N65, 2014.
 [7] J. A. O’Sullivan and J. Benac, “Alternating minimization multigrid algorithms for transmission tomography,” Proc. SPIE Computational Imaging II Conference, vol. 5299, pp. 216–221, May 2004.
 [8] T.S. Pan and A. E. Yagle, “Numerical study of multigrid implementations of some iterative image reconstruction algorithms,” Medical Imaging, IEEE Transactions on, vol. 10, no. 4, pp. 572–588, 1991.
 [9] S. Oh, A. B. Milstein, C. A. Bouman, and K. J. Webb, “A general framework for nonlinear multigrid inversion,” Image Processing, IEEE Transactions on, vol. 14, no. 1, pp. 125–140, 2005.
 [10] S. Oh, C. A. Bouman, and K. J. Webb, “Multigrid tomographic inversion with variable resolution data and image spaces,” Image Processing, IEEE Transactions on, vol. 15, no. 9, pp. 2805–2819, 2006.
 [11] S. Degirmenci, D. G. Politte, C. Bosch, N. Tricha, and J. A. O’Sullivan, “Acceleration of iterative image reconstruction for Xray imaging for security applications,” Proc. SPIE Computational Imaging XIII Conference, February 2015.
 [12] “American national standard for evaluating the image quality of xray computed tomography (CT) securityscreening systems,” ANSI N42.452011, pp. 1–58, May 2011.
 [13] H. Erdogan and J. A. Fessler, “Ordered subsets algorithms for transmission tomography,” Physics in Medicine and Biology, vol. 44, no. 11, p. 2835, 1999. [Online]. Available: http://stacks.iop.org/00319155/44/i=11/a=311