Multiresolution Approach to Acceleration of Iterative Image Reconstruction for X-Ray Imaging for Security Applications


Soysal Degirmenci
Department of Electrical and Systems Engineering
Washington University in St. Louis
Saint Louis, Missouri 63130

Joseph A. O’Sullivan
Department of Electrical and Systems Engineering
Washington University in St. Louis
Saint Louis, Missouri 63130

David G. Politte
Mallinckrodt Institute of Radiology
Washington University School of Medicine
Saint Louis, Missouri 63110

Three-dimensional x-ray CT image reconstruction for baggage scanning in security applications is an important research field. The variety of materials to be reconstructed is broader than in medical x-ray imaging, and the presence of highly attenuating materials such as metal may cause artifacts if analytical reconstruction methods are used. Statistical modeling and the resultant iterative algorithms are known to reduce these artifacts and to provide good quantitative accuracy in estimates of linear attenuation coefficients. However, iterative algorithms may require many iterations, and hence substantial computation, to achieve quantitatively accurate results. For the case of baggage scanning, in order to provide fast and accurate inspection throughput, they must be accelerated drastically. Many approaches have been proposed in the literature to increase the speed of convergence. This paper presents a new method that estimates the wavelet coefficients of the image in the discrete wavelet transform domain instead of estimating the image itself. Initially, surrogate functions are created around the approximation coefficients only. As the iterations proceed, the wavelet tree on which the updates are made is expanded based on a criterion, and the detail coefficients at each level are updated. For example, in smooth regions of the image the detail coefficients are not updated, while the coefficients that represent the high-frequency content around edges are updated, thus saving time by focusing computation where it is needed. This approach is implemented on real data from a SureScan™ x1000 Explosive Detection System (SureScan™ is a trademark of the SureScan Corporation) and compared to a straightforward implementation of the unregularized alternating minimization algorithm of O’Sullivan and Benac [1].

I Summary

X-ray CT image reconstruction algorithms are often designed to trade off a data-fit term and an image roughness term. Typical examples of data-fit terms are squared error [3] and the log-likelihood. Typical examples of image roughness terms are total variation, Gauss-Markov random field priors [4], and Huber-class roughness penalties. A complementary approach to penalizing roughness is to represent the image using a wavelet (or other multiresolution) expansion, directly estimate the wavelet coefficients, and introduce a penalty (often an $\ell_1$ penalty) on the wavelet coefficients [5, 6]. The multigrid approach [7, 8, 9, 10] has been shown to achieve faster convergence in some image reconstruction methods. The idea is to move through different grid levels over time; at any grid level, the voxel size is the same throughout the image domain. The approach we describe in this paper is closest to this, with the following differences:

  • Our data fit term is a Poisson log-likelihood with mean determined by Beer’s law

  • We use an alternating minimization framework to derive an algorithm that is guaranteed to decrease the cost function at every iteration

  • We extend the prior alternating minimization framework to point spread functions with negative values

  • We update only a subset of wavelet coefficients, constrained to be on a tree, thereby decreasing the computational complexity per iteration relative to fully updating the image

  • We adaptively update the tree defining which wavelet coefficients are updated at each iteration

  • Our wavelet tree structure results in image domain representation of voxels with different sizes

  • We incorporate an adaptive threshold allowing the computation of a sequence of images of increasing resolution, with increasing roughness and increasing log-likelihood

The result is a fast, adaptive, iterative image reconstruction algorithm.

II Introduction

The problem to be solved for x-ray CT in this paper is penalized likelihood estimation with a Poisson log-likelihood data-fitting term and a regularization term, optimized over the image $\mu$. It is shown in [1] that maximization of the Poisson log-likelihood term is equivalent to minimizing the I-divergence (defined for two nonnegative vectors $p$ and $q$ as $I(p \,\|\, q) = \sum_i \left[ p_i \ln(p_i/q_i) - p_i + q_i \right]$) between the transmission data $d$ and the estimated mean $g(\mu)$, where $g_i(\mu) = I_{0,i} \exp\big(-\sum_j h_{ij} \mu_j\big)$, $I_0$ is the incident photon count vector, and $h_{ij}$ is an element of the system matrix that represents the length of the intersection between the ray path of index $i$ and the voxel of index $j$. This penalized likelihood estimation problem can then be formulated as [11]

$$\hat{\mu} = \operatorname*{arg\,min}_{\mu \geq 0} \; I(d \,\|\, g(\mu)) + \lambda R(\mu),$$
where $R(\mu)$ is a regularization term selected as a roughness penalty and $\lambda$ is the parameter that controls the level of roughness imposed on the image. It is important to note that the non-negativity constraint on $\mu$ is due to the physical nature of linear attenuation coefficients. Since there is no closed-form solution to this problem, we solve it iteratively: at each iteration, a surrogate function that approximates the original objective function is minimized, which in turn decreases the original objective function. In our recent work [11], we generalized the formulation of surrogate functions in [1] from the data-fitting term to the regularization term. The idea is to use Jensen’s inequality to decouple the objective function into many one-parameter convex functions, minimize them, and iterate.
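As a concrete check of this formulation, the sketch below computes the Beer’s-law mean and the I-divergence for a toy system; the sizes, values, and function names are illustrative (ours, not the scanner geometry or the authors’ code). For noiseless data the I-divergence vanishes exactly at the true image and is positive elsewhere:

```python
import numpy as np

def beer_mean(I0, H, mu):
    """Estimated mean photon counts g(mu) under Beer's law."""
    return I0 * np.exp(-H @ mu)

def i_divergence(d, g):
    """I-divergence I(d||g) = sum_i [d_i ln(d_i/g_i) - d_i + g_i]."""
    d = np.asarray(d, float)
    g = np.asarray(g, float)
    mask = d > 0                     # convention: 0 * ln 0 = 0
    out = float(np.sum(g - d))
    out += float(np.sum(d[mask] * np.log(d[mask] / g[mask])))
    return out

# Toy system: 3 ray paths through 2 voxels (intersection lengths)
H = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.7, 0.7]])
I0 = np.full(3, 1e4)                 # incident photon counts
mu_true = np.array([0.3, 0.1])       # true attenuation coefficients
d = beer_mean(I0, H, mu_true)        # noiseless "transmission data"

# Divergence is zero at the true image, positive away from it
assert abs(i_divergence(d, beer_mean(I0, H, mu_true))) < 1e-9
assert i_divergence(d, beer_mean(I0, H, mu_true + 0.05)) > 0
```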

Assume that there exists a discrete inverse wavelet transform matrix $W$ that is non-singular. Then the image can be represented as

$$\mu = W c,$$
where $c$ is the vector of wavelet coefficients. The problem in this paper can then be written as

$$\hat{c} = \operatorname*{arg\,min}_{c \,:\, Wc \geq 0} \; I(d \,\|\, g(Wc)) + \lambda R(Wc).$$
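The representation $\mu = Wc$ can be illustrated with a one-level orthonormal Haar transform on a short 1-D signal. This is a sketch with illustrative sizes; the paper itself uses a 2-D level-3 Haar transform applied per z-slice. Because $W$ is orthonormal, it is non-singular and $W^\top$ is the forward transform:

```python
import numpy as np

def haar_synthesis(n):
    """Inverse (synthesis) matrix W of a one-level orthonormal Haar
    transform for a length-n signal (n even): mu = W @ c.  The first
    n//2 entries of c are approximation coefficients, the rest detail
    coefficients.  (Illustrative helper, not the paper's transform.)"""
    s = 1.0 / np.sqrt(2.0)
    W = np.zeros((n, n))
    for k in range(n // 2):
        W[2 * k, k] = s                      # approximation basis
        W[2 * k + 1, k] = s
        W[2 * k, n // 2 + k] = s             # detail basis
        W[2 * k + 1, n // 2 + k] = -s
    return W

n = 8
W = haar_synthesis(n)
# Orthonormal, hence non-singular: the forward transform is W^T
assert np.allclose(W @ W.T, np.eye(n))

mu = np.array([2., 2., 2., 2., 5., 1., 0., 0.])
c = W.T @ mu                       # analysis: wavelet coefficients
assert np.allclose(W @ c, mu)      # perfect reconstruction
# The smooth first half of the signal needs no detail coefficients
assert np.allclose(c[n // 2 : n // 2 + 2], 0.0)
```

The last assertion mirrors the motivation above: coefficients representing smooth regions are zero (and need not be updated), while edges produce nonzero detail coefficients.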
Below, the derivation of the surrogate functions for the data fitting term is shown. A similar approach yields surrogate functions for the regularization term as well.

The I-divergence term can be written as

$$I(d \,\|\, g(Wc)) = \sum_i \left[ d_i \ln \frac{d_i}{g_i(Wc)} - d_i + g_i(Wc) \right].$$

For simplicity, define the matrix $\tilde{H} = HW$, where $\tilde{h}_{ij}$ is the system matrix element between the ray path of index $i$ and the wavelet coefficient of index $j$. Assume that there exists a known estimate $\hat{c}$, and let $\hat{g}_i = g_i(W\hat{c})$. The terms in the I-divergence that depend on $c$ are used to construct surrogate functions as follows:

$$\sum_i \left[ d_i \sum_j \tilde{h}_{ij} c_j + I_{0,i} \exp\Big(-\sum_j \tilde{h}_{ij} c_j\Big) \right]
= \sum_i \left[ d_i \sum_j \tilde{h}_{ij} c_j + \hat{g}_i \exp\Big(-\sum_j \tilde{h}_{ij} (c_j - \hat{c}_j)\Big) \right].$$

For the exponential term, the convex decomposition lemma [1] is used for $r_{ij} \geq 0$, $\sum_{j \in \Gamma} r_{ij} \leq 1$:

$$\exp\Big(-\sum_{j \in \Gamma} \tilde{h}_{ij} (c_j - \hat{c}_j)\Big) \leq \sum_{j \in \Gamma} r_{ij} \exp\Big(-\frac{\tilde{h}_{ij}}{r_{ij}} (c_j - \hat{c}_j)\Big) + \Big(1 - \sum_{j \in \Gamma} r_{ij}\Big).$$

$r_{ij}$ can be chosen as

$$r_{ij} = \frac{|\tilde{h}_{ij}|}{Z}, \qquad Z = \max_i \sum_{j \in \Gamma} |\tilde{h}_{ij}|,$$

which satisfies $r_{ij} \geq 0$ and $\sum_{j \in \Gamma} r_{ij} \leq 1$. Here, $\Gamma$ represents a subset of the wavelet domain to be chosen for update. In our approach, we choose it in a way that every voxel in the image domain is represented at any iteration, possibly with different numbers of coefficients. This subset can be fixed or varied over iterations.

Adding the constant terms in the I-divergence, we define our surrogate function

$$\phi(c; \hat{c}) = \sum_i \left[ d_i \sum_j \tilde{h}_{ij} c_j + \hat{g}_i \sum_{j \in \Gamma} r_{ij} \exp\Big(-\frac{\tilde{h}_{ij}}{r_{ij}} (c_j - \hat{c}_j)\Big) + \hat{g}_i \Big(1 - \sum_{j \in \Gamma} r_{ij}\Big) \right] + C(d),$$

where $C(d)$ collects the terms that do not depend on $c$. It is clear that this is a convex function of each $c_j$ separately, and the gradient with respect to $c_j$, $j \in \Gamma$, is

$$\frac{\partial \phi(c; \hat{c})}{\partial c_j} = \sum_i d_i \tilde{h}_{ij} - \sum_i \hat{g}_i \tilde{h}_{ij} \exp\Big(-\frac{\tilde{h}_{ij}}{r_{ij}} (c_j - \hat{c}_j)\Big).$$

The first-order necessary condition for a minimizer is that this gradient be zero. With the choice $r_{ij} = |\tilde{h}_{ij}|/Z$, the exponent becomes $-Z \,\mathrm{sign}(\tilde{h}_{ij})(c_j - \hat{c}_j)$, and setting the gradient to zero reduces to a quadratic equation in $\exp(-Z(c_j - \hat{c}_j))$, which has a closed-form solution. The algorithm is shown below.
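One possible form of this closed-form update can be sketched as follows, under the choice $r_{ij} = |\tilde{h}_{ij}|/Z$ described above. The function and variable names are ours (not the authors’ code), and the sketch assumes at least one strictly positive $\tilde{h}_{ij}$ for the coefficient being updated:

```python
import numpy as np

def surrogate_update(c_hat_j, h_j, d, g_hat, Z):
    """Minimize the one-parameter surrogate in c_j.
    With r_ij = |h_ij| / Z, zeroing the gradient gives a quadratic in
    x = exp(-Z * (c_j - c_hat_j)):   bp * x**2 - beta * x - bn = 0,
    whose positive root yields the update below.
    Assumes h_j has at least one strictly positive entry (bp > 0)."""
    bp = np.sum(np.clip(h_j, 0, None) * g_hat)    # backprojected means, h > 0
    bn = np.sum(np.clip(-h_j, 0, None) * g_hat)   # backprojected means, h < 0
    beta = np.sum(h_j * d)                        # backprojected data
    x = (beta + np.sqrt(beta**2 + 4 * bp * bn)) / (2 * bp)
    return c_hat_j - np.log(x) / Z

# Sanity check: when the data equal the current estimated means, the
# coefficient is already a fixed point and the update leaves it unchanged.
h = np.array([1.0, -0.5, 0.8, -0.2])      # mixed-sign wavelet-domain column
g_hat = np.array([120.0, 300.0, 80.0, 50.0])
assert abs(surrogate_update(0.7, h, g_hat, g_hat, Z=2.5) - 0.7) < 1e-12
```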

for $k = 1, 2, \ldots$ do
     for every $j \in \Gamma$ do
          $\hat{b}^{+}_j = \sum_{i:\, \tilde{h}_{ij} > 0} \tilde{h}_{ij} \hat{g}_i$, $\quad \hat{b}^{-}_j = \sum_{i:\, \tilde{h}_{ij} < 0} |\tilde{h}_{ij}| \hat{g}_i$, $\quad \beta_j = \sum_i d_i \tilde{h}_{ij}$
          $c_j \leftarrow \hat{c}_j - \dfrac{1}{Z} \ln \dfrac{\beta_j + \sqrt{\beta_j^2 + 4 \hat{b}^{+}_j \hat{b}^{-}_j}}{2 \hat{b}^{+}_j}$
     end for
end for
Algorithm 1 Unregularized Wavelet AM Algorithm
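A minimal end-to-end sketch of the algorithm on a toy 1-D problem (noiseless data, one-level Haar wavelet, illustrative sizes and names, with $\Gamma$ taken to be all coefficients) illustrates the guaranteed monotone decrease of the objective:

```python
import numpy as np

s = 1 / np.sqrt(2)
# One-level Haar synthesis matrix for a length-4 image: mu = W @ c
W = np.array([[s, 0.0,  s, 0.0],
              [s, 0.0, -s, 0.0],
              [0.0, s, 0.0,  s],
              [0.0, s, 0.0, -s]])

H = np.abs(np.random.default_rng(0).normal(size=(12, 4)))  # nonneg ray lengths
Ht = H @ W                               # system matrix in the wavelet domain
I0 = 1e5                                 # incident photon count (same per ray)
mu_true = np.array([0.2, 0.25, 0.6, 0.1])
d = I0 * np.exp(-H @ mu_true)            # noiseless transmission data

Z = np.abs(Ht).sum(axis=1).max()         # Z >= sum_j |h~_ij| for every ray i

def idiv(c):
    g = I0 * np.exp(-Ht @ c)
    return float(np.sum(d * np.log(d / g) - d + g))

c = W.T @ np.full(4, 0.3)                # initialize from a flat image
obj = [idiv(c)]
for _ in range(100):
    for j in range(4):                   # Gamma = all coefficients here
        g_hat = I0 * np.exp(-Ht @ c)     # current estimated means
        h = Ht[:, j]
        bp = np.sum(np.clip(h, 0, None) * g_hat)   # backprojected means, h > 0
        bn = np.sum(np.clip(-h, 0, None) * g_hat)  # backprojected means, h < 0
        beta = np.sum(h * d)                       # backprojected data
        x = (beta + np.sqrt(beta**2 + 4 * bp * bn)) / (2 * bp)
        c[j] -= np.log(x) / Z            # closed-form surrogate minimizer
    obj.append(idiv(c))

# Each update minimizes a surrogate that majorizes the objective at the
# current point, so the objective never increases.
assert all(b <= a + 1e-6 for a, b in zip(obj, obj[1:]))
assert obj[-1] < obj[0]
```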

III Results

The multiresolution technique has been evaluated using a real data scan of the NIST Phantom Test Article A [12] acquired on a SureScan™ x1000 Explosive Detection System. A two-dimensional level-3 Haar discrete wavelet transform is used to represent each z-slice of the three-dimensional image domain. The wavelet tree is initialized to consist of approximation coefficients only. At a preset iteration number, the coefficients are back-projected to the image space, the voxel values across z-slices are summed, and the pixels whose values are larger than a fixed fraction of the maximum of the summed image are chosen for expansion of the tree by one level. The same procedure, with the same threshold factor, is applied at a later iteration to expand the tree one level further, and a final expansion is performed in the same way. Figure 1 shows objective function values versus time for the unregularized alternating minimization (AM) algorithm [1] and the unregularized wavelet AM algorithm presented in this paper. The AM algorithm was run for 100 iterations, while the wavelet AM algorithm was run for 300 iterations. Figures 2 and 3 show image slices reconstructed by the two algorithms at the same objective function value. The difference between these two images (the unregularized AM image subtracted from the wavelet AM image) is shown in Figure 4. It is important to note that even though the two images are at the same objective function value, the image reconstructed using wavelet AM has sharper edges.
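The expansion criterion described above can be sketched as follows; the array shapes, threshold value, and function name are illustrative, not taken from the authors’ implementation:

```python
import numpy as np

def pixels_to_expand(slices, tau):
    """slices: (nz, ny, nx) stack of reconstructed z-slices.
    Returns a boolean (ny, nx) mask of pixels whose values, summed
    across z-slices, exceed tau times the maximum of the summed image;
    these pixels have their wavelet tree expanded by one level."""
    summed = slices.sum(axis=0)
    return summed > tau * summed.max()

rng = np.random.default_rng(1)
vol = rng.random((4, 8, 8)) * 0.1        # smooth, low-valued background
vol[:, 2, 3] = 1.0                       # a bright "edge" pixel in every slice

mask = pixels_to_expand(vol, tau=0.5)
assert mask[2, 3]                        # the bright pixel is flagged
assert mask.sum() < mask.size            # smooth regions are not expanded
```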

Fig. 1: Objective function values vs. time for AM and Wavelet AM.
Fig. 2: Image reconstructed with unregularized AM after 100 iterations.
Fig. 3: Image reconstructed with wavelet AM after 300 iterations.
Fig. 4: Difference image, unregularized AM image subtracted from wavelet AM image.

IV Conclusion

A fast, iterative, and adaptive algorithm for x-ray imaging was formulated and presented using an alternating minimization framework. The algorithm is guaranteed to decrease the objective function at each iteration, and the adaptive wavelet tree structure makes better use of computation: more computation is devoted to regions with high-frequency components, such as edges, and less to smoother areas. The wavelet tree expansion used to reconstruct the image shown in the results section is one of many possible expansion methods; different ways of expanding the tree will be investigated in the future. Different scale levels of the discrete wavelet transform, different wavelet types, and the use of regularization are other directions to be explored. Furthermore, this method can be combined with other acceleration methods such as ordered subsets [13]. Preliminary studies combining ordered subsets with wavelet AM showed promising results and will be investigated further.


We thank Carl Bosch, Nawfel Tricha, Sean Corrigan, Michael Hollenbeck, and Assaf Mesika of SureScan™ Corporation for their contributions to the collection, formatting, and sharing of the data.


  • [1] J. A. O’Sullivan and J. Benac, “Alternating minimization algorithms for transmission tomography,” IEEE Transactions on Medical Imaging, vol. 26, no. 3, pp. 283–297, March 2007.
  • [2] K. Lange, “Convergence of EM image reconstruction algorithms with Gibbs smoothing,” IEEE Transactions on Medical Imaging, vol. 9, no. 4, pp. 439–446, December 1990.
  • [3] J.-B. Thibault, K. D. Sauer, C. A. Bouman, and J. Hsieh, “A three-dimensional statistical approach to improved image quality for multislice helical CT,” Medical Physics, vol. 34, no. 11, pp. 4526–4544, 2007.
  • [4] C. Bouman and K. Sauer, “A generalized Gaussian image model for edge-preserving MAP estimation,” IEEE Transactions on Image Processing, vol. 2, no. 3, pp. 296–310, 1993.
  • [5] S. Ramani and J. A. Fessler, “A splitting-based iterative algorithm for accelerated statistical X-ray CT reconstruction,” IEEE Transactions on Medical Imaging, vol. 31, no. 3, pp. 677–688, 2012.
  • [6] Q. Xu, A. Sawatzky, M. A. Anastasio, and C. O. Schirra, “Sparsity-regularized image reconstruction of decomposed K-edge data in spectral CT,” Physics in Medicine and Biology, vol. 59, no. 10, p. N65, 2014.
  • [7] J. A. O’Sullivan and J. Benac, “Alternating minimization multigrid algorithms for transmission tomography,” Proc. SPIE Computational Imaging II Conference, vol. 5299, pp. 216–221, May 2004.
  • [8] T.-S. Pan and A. E. Yagle, “Numerical study of multigrid implementations of some iterative image reconstruction algorithms,” IEEE Transactions on Medical Imaging, vol. 10, no. 4, pp. 572–588, 1991.
  • [9] S. Oh, A. B. Milstein, C. A. Bouman, and K. J. Webb, “A general framework for nonlinear multigrid inversion,” IEEE Transactions on Image Processing, vol. 14, no. 1, pp. 125–140, 2005.
  • [10] S. Oh, C. A. Bouman, and K. J. Webb, “Multigrid tomographic inversion with variable resolution data and image spaces,” IEEE Transactions on Image Processing, vol. 15, no. 9, pp. 2805–2819, 2006.
  • [11] S. Degirmenci, D. G. Politte, C. Bosch, N. Tricha, and J. A. O’Sullivan, “Acceleration of iterative image reconstruction for X-ray imaging for security applications,” Proc. SPIE Computational Imaging XIII Conference, February 2015.
  • [12] “American national standard for evaluating the image quality of X-ray computed tomography (CT) security-screening systems,” ANSI N42.45-2011, pp. 1–58, May 2011.
  • [13] H. Erdogan and J. A. Fessler, “Ordered subsets algorithms for transmission tomography,” Physics in Medicine and Biology, vol. 44, no. 11, p. 2835, 1999.