A FISTA-like scheme to accelerate GISTA?

1 A FISTA-like scheme to accelerate GISTA? C. Cloquet (1), I. Loris (2), C. Verhoeven (2) and M. Defrise (1). (1) Dept. of Nuclear Medicine, Vrije Universiteit Brussel; (2) Dept. of Mathematics, Université Libre de Bruxelles. MIMS seminar, Manchester, January 11. ccloquet@vub.ac.be, igloris@ulb.ac.be, cverhoev@ulb.ac.be, mdefrise@vub.ac.be. tiny.cc/cloquet

2 Cone Beam VUB [Philips Brightview XCT]

3 Cone Beam VUB [Philips Brightview XCT]. Image: newscenter.philips.com; W. Scarfe et al., J Can Dent Assoc 2006, 72(1):75-80, Figure 1: X-ray beam projection scheme comparing a single detector array fan-beam CT (a) and cone-beam CT (b) geometry.

4 Cone Beam VUB [Bruker Skyscan microct 1178]

5 Challenges. CT: what % of the cancers in the US are caused by CT studies? [Brenner and Hall, NEJM, 2007]. Lower the dose, i.e. do more with less.

6 Challenges. CT: what % of the cancers in the US are caused by CT studies? [Brenner and Hall, NEJM, 2007]. Lower the dose, i.e. do more with less. Cone-beam specific: cone-beam artifacts.

7 Medical images are piecewise constant... ar.in.tum.de

8 Sparsity: ... i.e. the gradient of medical images is sparse. brsoc.org.uk

9 Sparsity: ... i.e. the gradient of medical images is sparse.
Sparse gradient: $\nabla f = \left(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z}\right)^t$ and $\sum_i \|\nabla f(x_i)\|_2$ is small.
Few transitions: sharp edges between flat regions. [Defrise et al., 2011; Rudin et al., 1992; Sidky and Pan, 2008; Sidky et al., 2006] brsoc.org.uk
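As a small, purely illustrative sketch (not from the slides; the image and sizes are arbitrary), the snippet below builds a piecewise-constant 2D image and checks that most entries of its discrete gradient are exactly zero, which is the sparsity exploited here.

```python
import numpy as np

# Toy piecewise-constant image: two flat regions separated by sharp edges.
f = np.zeros((64, 64))
f[20:45, 15:50] = 1.0

# Forward-difference gradient (replicating the last row/column gives zero there).
gx = np.diff(f, axis=0, append=f[-1:, :])
gy = np.diff(f, axis=1, append=f[:, -1:])

# Per-pixel gradient magnitude ||grad f(x_i)||_2.
mag = np.sqrt(gx**2 + gy**2)

print("fraction of pixels with non-zero gradient:", np.mean(mag > 0))  # small: the gradient is sparse
print("sum of gradient magnitudes (isotropic TV):", mag.sum())
```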

10 CT acquisition and reconstruction
Attenuation of an X-ray beam: $I = I_0 \exp\left(-\int \mu(s)\, ds\right)$, i.e. $y \stackrel{\mathrm{def}}{=} \int \mu(s)\, ds = \log\left(\frac{I_0}{I}\right)$.

11 CT acquisition and reconstruction
Attenuation of an X-ray beam: $I = I_0 \exp\left(-\int \mu(s)\, ds\right)$, i.e. $y \stackrel{\mathrm{def}}{=} \int \mu(s)\, ds = \log\left(\frac{I_0}{I}\right)$.
CT acquisition: image $x \in \mathbb{R}^M$; data $y \in \mathbb{R}^{PJ}$ ($P$ projections); projector $K : \mathbb{R}^M \to \mathbb{R}^{PJ} : x \mapsto \{y_p = K_p x\}_{p=1..P}$.

12 CT acquisition and reconstruction
Attenuation of an X-ray beam: $I = I_0 \exp\left(-\int \mu(s)\, ds\right)$, i.e. $y \stackrel{\mathrm{def}}{=} \int \mu(s)\, ds = \log\left(\frac{I_0}{I}\right)$.
CT acquisition: image $x \in \mathbb{R}^M$; data $y \in \mathbb{R}^{PJ}$ ($P$ projections); projector $K : \mathbb{R}^M \to \mathbb{R}^{PJ} : x \mapsto \{y_p = K_p x\}_{p=1..P}$.
Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$

13 CT acquisition and reconstruction
Attenuation of an X-ray beam: $I = I_0 \exp\left(-\int \mu(s)\, ds\right)$, i.e. $y \stackrel{\mathrm{def}}{=} \int \mu(s)\, ds = \log\left(\frac{I_0}{I}\right)$.
CT acquisition: image $x \in \mathbb{R}^M$; data $y \in \mathbb{R}^{PJ}$ ($P$ projections); projector $K : \mathbb{R}^M \to \mathbb{R}^{PJ} : x \mapsto \{y_p = K_p x\}_{p=1..P}$.
Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$
Reconstruction: $x^\star = \arg\min_x \Phi(x, y)$
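As a minimal illustration of the acquisition model above (my own sketch; the function name and numbers are placeholders, not code from the talk), the measured intensities are turned into the line integrals $y = \log(I_0/I)$ that the reconstruction operates on:

```python
import numpy as np

def intensities_to_line_integrals(I, I0):
    """Beer-Lambert log-transform: y = log(I0 / I) for measured intensities I > 0."""
    return np.log(I0 / np.asarray(I, dtype=float))

# One ray through material whose attenuation integral is 0.7:
I0 = 1e4
I = I0 * np.exp(-0.7)
print(intensities_to_line_integrals(I, I0))  # ~0.7
```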

14 Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$

15 Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$
Data term: $G$ convex + smooth, $G(x, y) = \tfrac{1}{2}\|K x - y\|_2^2$

16 Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$
Data term: $G$ convex + smooth, $G(x, y) = \tfrac{1}{2}\|K x - y\|_2^2$
Penalty term: $H$ convex + non-smooth, with $A$ any linear operator

17 Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$
Data term: $G$ convex + smooth, $G(x, y) = \tfrac{1}{2}\|K x - y\|_2^2$
Penalty term: $H$ convex + non-smooth, with $A$ any linear operator
Total variation penalty [isotropic]: $(A x)_i = (\nabla x)_i = \big(x_i - x_{i-[100]},\; x_i - x_{i-[010]},\; x_i - x_{i-[001]}\big)$, $H(z) = \|z\|_1$ (for the isotropic TV, the sum over voxels of the Euclidean norm of the per-voxel gradient vector)
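To make the notation concrete, here is a small self-contained sketch (an illustration under assumptions of my own: a 2D image, $A = \nabla$ implemented with forward differences, and a dense random matrix standing in for the projector $K$) that evaluates $\Phi(x, y) = \tfrac{1}{2}\|Kx - y\|_2^2 + \lambda H(\nabla x)$ with the isotropic TV penalty.

```python
import numpy as np

def grad2d(x):
    """A x = nabla x: forward differences along each axis (2D illustration)."""
    gx = np.diff(x, axis=0, append=x[-1:, :])
    gy = np.diff(x, axis=1, append=x[:, -1:])
    return np.stack([gx, gy])

def isotropic_tv(z):
    """H(z): sum over pixels of the Euclidean norm of the gradient vector."""
    return np.sqrt((z ** 2).sum(axis=0)).sum()

def cost(x, y, K, lam):
    """Phi(x, y) = 0.5 * ||K x - y||^2 + lam * H(nabla x)."""
    resid = K @ x.ravel() - y
    return 0.5 * resid @ resid + lam * isotropic_tv(grad2d(x))

# Toy sizes; a random dense K stands in for the CT projector.
rng = np.random.default_rng(0)
x = rng.random((16, 16))
K = rng.random((50, 16 * 16))
y = K @ x.ravel()
print(cost(x, y, K, lam=0.01))
```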

18 How to solve?

19 Simultaneous Algebraic Reconstruction Technique [Andersen and Kak, 1984]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $0 < \tau_p < 2/\|K_p K_p^T\|$.

20 Simultaneous Algebraic Reconstruction Technique [Andersen and Kak, 1984]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $0 < \tau_p < 2/\|K_p K_p^T\|$.
Iteration: $x_{n+1} = I_{\mathrm{SART}}(x_n)$:
$x^{(n,0)} = x_n$
for $p = 0 \ldots P-1$: $x^{(n,p+1)} = x^{(n,p)} + \tau_p K_p^T\big(y_p - K_p x^{(n,p)}\big)$
$x_{n+1} = x^{(n,P)}$

21 Simultaneous Algebraic Reconstruction Technique [Andersen and Kak, 1984]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $0 < \tau_p < 2/\|K_p K_p^T\|$.
Iteration: $x_{n+1} = I_{\mathrm{SART}}(x_n)$:
$x^{(n,0)} = x_n$
for $p = 0 \ldots P-1$: $x^{(n,p+1)} = x^{(n,p)} + \tau_p K_p^T\big(y_p - K_p x^{(n,p)}\big)$
$x_{n+1} = x^{(n,P)}$
Takes no account of any penalty.
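A minimal sketch of the SART-style loop above, in my own words: one outer iteration sweeps once over the $P$ projections, back-projecting the residual of projection $p$ with a step $\tau_p < 2/\|K_p K_p^T\|$. Dense random matrices stand in for the per-projection operators $K_p$.

```python
import numpy as np

def sart_iteration(x, Ks, ys, taus):
    """One outer SART iteration: sweep once over all P projections.

    Ks   : list of per-projection matrices K_p (each J x M)
    ys   : list of per-projection data vectors y_p (each length J)
    taus : per-projection steps, 0 < tau_p < 2 / ||K_p K_p^T||
    """
    for Kp, yp, tp in zip(Ks, ys, taus):
        x = x + tp * Kp.T @ (yp - Kp @ x)   # back-project the residual of view p
    return x

# Tiny example with P = 4 random "projections" standing in for the CT views.
rng = np.random.default_rng(1)
M, J, P = 100, 30, 4
x_true = rng.random(M)
Ks = [rng.random((J, M)) for _ in range(P)]
ys = [Kp @ x_true for Kp in Ks]
taus = [1.0 / np.linalg.norm(Kp @ Kp.T, 2) for Kp in Ks]

x = np.zeros(M)
for n in range(20):
    x = sart_iteration(x, Ks, ys, taus)
print("data residual:", sum(np.linalg.norm(Kp @ x - yp) for Kp, yp in zip(Ks, ys)))
```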

22 Current algorithms suited for TV us.123rf.com

23 Current algorithms suited for TV
Algorithms:
PICCS, ASD-POCS: alternate the minimization of the data term and of TV
ISTA: $x_k = \arg\min_x \big\{ H(x) + \tfrac{1}{2 t_k}\,\|x - (x_{k-1} - t_k \nabla G(x_{k-1}))\|^2 \big\}$
SART-TV: uses a surrogate and a differentiable TV

24 Current algorithms suited for TV
Algorithms:
PICCS, ASD-POCS: alternate the minimization of the data term and of TV
ISTA: $x_k = \arg\min_x \big\{ H(x) + \tfrac{1}{2 t_k}\,\|x - (x_{k-1} - t_k \nabla G(x_{k-1}))\|^2 \big\}$
SART-TV: uses a surrogate and a differentiable TV
[PICCS: [Chen et al., 2008]; ASD-POCS: [Ramani and Fessler, 2012; Sidky and Pan, 2008]; ISTA: [Beck and Teboulle, 2009b; Daubechies et al., 2004]; SART-TV: [Defrise et al., 2011]]
Drawbacks:
need for nested iterations when using TV (all)

25 Current algorithms suited for TV
Algorithms:
PICCS, ASD-POCS: alternate the minimization of the data term and of TV
ISTA: $x_k = \arg\min_x \big\{ H(x) + \tfrac{1}{2 t_k}\,\|x - (x_{k-1} - t_k \nabla G(x_{k-1}))\|^2 \big\}$
SART-TV: uses a surrogate and a differentiable TV
[PICCS: [Chen et al., 2008]; ASD-POCS: [Ramani and Fessler, 2012; Sidky and Pan, 2008]; ISTA: [Beck and Teboulle, 2009b; Daubechies et al., 2004]; SART-TV: [Defrise et al., 2011]]
Drawbacks:
need for nested iterations when using TV (all)
need for a differentiable penalty (SART-TV)

26 Current algorithms suited for TV
Algorithms:
PICCS, ASD-POCS: alternate the minimization of the data term and of TV
ISTA: $x_k = \arg\min_x \big\{ H(x) + \tfrac{1}{2 t_k}\,\|x - (x_{k-1} - t_k \nabla G(x_{k-1}))\|^2 \big\}$
SART-TV: uses a surrogate and a differentiable TV
[PICCS: [Chen et al., 2008]; ASD-POCS: [Ramani and Fessler, 2012; Sidky and Pan, 2008]; ISTA: [Beck and Teboulle, 2009b; Daubechies et al., 2004]; SART-TV: [Defrise et al., 2011]]
Drawbacks:
need for nested iterations when using TV (all)
need for a differentiable penalty (SART-TV)
no proof of convergence (PICCS, ASD-POCS)

27 Current algorithms suited for TV
Algorithms:
PICCS, ASD-POCS: alternate the minimization of the data term and of TV
ISTA: $x_k = \arg\min_x \big\{ H(x) + \tfrac{1}{2 t_k}\,\|x - (x_{k-1} - t_k \nabla G(x_{k-1}))\|^2 \big\}$
SART-TV: uses a surrogate and a differentiable TV
[PICCS: [Chen et al., 2008]; ASD-POCS: [Ramani and Fessler, 2012; Sidky and Pan, 2008]; ISTA: [Beck and Teboulle, 2009b; Daubechies et al., 2004]; SART-TV: [Defrise et al., 2011]]
Drawbacks:
need for nested iterations when using TV (all)
need for a differentiable penalty (SART-TV)
no proof of convergence (PICCS, ASD-POCS)
slow convergence (ISTA): $\Phi(x_n, y) - \Phi(x^\star, y) \sim n^{-1}$
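For reference, the ISTA step above is a proximal-gradient update, $x_k = \mathrm{prox}_{t_k H}\big(x_{k-1} - t_k \nabla G(x_{k-1})\big)$. The sketch below (illustrative, not from the talk) shows it for the separable penalty $H = \lambda\|\cdot\|_1$, where the prox is a closed-form soft-thresholding; with $H(\nabla x)$ (TV) the prox has no closed form and needs inner iterations, which is exactly the drawback listed above.

```python
import numpy as np

def soft_threshold(v, t):
    """prox of t * ||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_step(x, K, y, lam, t):
    """One ISTA step for 0.5 * ||K x - y||^2 + lam * ||x||_1 (separable penalty)."""
    grad = K.T @ (K @ x - y)               # gradient of the data term G
    return soft_threshold(x - t * grad, t * lam)

rng = np.random.default_rng(2)
K = rng.random((40, 80))
x_true = np.zeros(80)
x_true[:5] = 1.0
y = K @ x_true
t = 1.0 / np.linalg.norm(K, 2) ** 2        # step <= 1 / ||K^T K||
x = np.zeros(80)
for k in range(200):
    x = ista_step(x, K, y, lam=0.05, t=t)
```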

28 FISTA accelerates the convergence of ISTA [Beck and Teboulle, 2009a]
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $t_0 = 1$; $\theta_0 = 0$

29 FISTA accelerates the convergence of ISTA [Beck and Teboulle, 2009a]
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $t_0 = 1$; $\theta_0 = 0$
Iteration: $x_{n+1} = I_{\mathrm{FISTA}}(x_n, x_{n-1})$:
$x_{n+1} = I_{\mathrm{ISTA}}\big((1 + \theta_n)\, x_n - \theta_n\, x_{n-1}\big)$

30 FISTA accelerates the convergence of ISTA [Beck and Teboulle, 2009a]
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $t_0 = 1$; $\theta_0 = 0$
Iteration: $x_{n+1} = I_{\mathrm{FISTA}}(x_n, x_{n-1})$:
$x_{n+1} = I_{\mathrm{ISTA}}\big((1 + \theta_n)\, x_n - \theta_n\, x_{n-1}\big)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$

31 FISTA accelerates the convergence of ISTA [Beck and Teboulle, 2009a]
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $t_0 = 1$; $\theta_0 = 0$
Iteration: $x_{n+1} = I_{\mathrm{FISTA}}(x_n, x_{n-1})$:
$x_{n+1} = I_{\mathrm{ISTA}}\big((1 + \theta_n)\, x_n - \theta_n\, x_{n-1}\big)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$
with $s(t_n) = \Big(\tfrac{1 + \sqrt{1 + 4 t_n^2}}{2},\; \tfrac{t_n - 1}{t_{n+1}}\Big)$.

32 FISTA accelerates the convergence of ISTA [Beck and Teboulle, 2009a]
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $t_0 = 1$; $\theta_0 = 0$
Iteration: $x_{n+1} = I_{\mathrm{FISTA}}(x_n, x_{n-1})$:
$x_{n+1} = I_{\mathrm{ISTA}}\big((1 + \theta_n)\, x_n - \theta_n\, x_{n-1}\big)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$
with $s(t_n) = \Big(\tfrac{1 + \sqrt{1 + 4 t_n^2}}{2},\; \tfrac{t_n - 1}{t_{n+1}}\Big)$.
Speed of convergence: $\Phi(x_n, y) - \Phi(x^\star, y) \sim n^{-2}$.
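A compact sketch of the FISTA wrapper above (my own transcription): `ista_step` can be any inner ISTA update, for instance the one sketched earlier, and the momentum parameters follow $s(t_n)$.

```python
import numpy as np

def fista(ista_step, x0, n_iter):
    """FISTA acceleration of a given ISTA update I_ISTA (a callable x -> x)."""
    x_prev = np.zeros_like(x0)                     # x_{-1} = 0
    x = x0.copy()
    t, theta = 1.0, 0.0                            # t_0 = 1, theta_0 = 0
    for n in range(n_iter):
        v = (1.0 + theta) * x - theta * x_prev     # extrapolated point
        x_prev, x = x, ista_step(v)                # x_{n+1} = I_ISTA(v)
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        theta = (t - 1.0) / t_next                 # (t_{n+1}, theta_{n+1}) = s(t_n)
        t = t_next
    return x
```

With the earlier `ista_step` sketch, one would call something like `fista(lambda v: ista_step(v, K, y, lam, t), x0, 100)`.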

33 A way to overcome the difficulties: Generalized ISTA (GISTA) [Loris and Verhoeven, 2011]
Cost function: $\Phi(x, y) = G(x, y) + \lambda H(A x)$
suitable for $A = \nabla$
reduces to ISTA for $A$ orthogonal
no internal iteration
proven convergence

34 GISTA [Loris and Verhoeven, 2011]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $\tau < 2/\|K^T K\|$; $\sigma < 1/\|A A^T\|$.

35 GISTA [Loris and Verhoeven, 2011]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $\tau < 2/\|K^T K\|$; $\sigma < 1/\|A A^T\|$.
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(x_n, w_n)$:
$x_{n+1} = x_n + \tau K^T (y - K x_n)$

36 GISTA [Loris and Verhoeven, 2011]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $\tau < 2/\|K^T K\|$; $\sigma < 1/\|A A^T\|$.
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(x_n, w_n)$:
$x_{n+1} = x_n + \tau K^T (y - K x_n) - \tau \nabla^T w_n$

37 GISTA [Loris and Verhoeven, 2011]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $\tau < 2/\|K^T K\|$; $\sigma < 1/\|A A^T\|$.
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(x_n, w_n)$:
$\bar{x}_{n+1} = x_n + \tau K^T (y - K x_n) - \tau \nabla^T w_n$
$w_{n+1} = P_\lambda\big(w_n + \tfrac{\sigma}{\tau} \nabla \bar{x}_{n+1}\big)$

38 GISTA [Loris and Verhoeven, 2011]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $\tau < 2/\|K^T K\|$; $\sigma < 1/\|A A^T\|$.
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(x_n, w_n)$:
$\bar{x}_{n+1} = x_n + \tau K^T (y - K x_n) - \tau \nabla^T w_n$
$w_{n+1} = P_\lambda\big(w_n + \tfrac{\sigma}{\tau} \nabla \bar{x}_{n+1}\big)$
$P_\lambda(u) = \big\{ \lambda\, u_i / \|u_i\| \ \text{if } \|u_i\| > \lambda;\ \ u_i \ \text{if } \|u_i\| \le \lambda \big\}_{i=1..M}$, with $u_i \in \mathbb{R}^D$ and $\|u_i\| = \sqrt{u_{i,x}^2 + u_{i,y}^2 + u_{i,z}^2}$.

39 GISTA [Loris and Verhoeven, 2011]
Initialization: $x_0$: arbitrary image $\in \mathbb{R}^M$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $\tau < 2/\|K^T K\|$; $\sigma < 1/\|A A^T\|$.
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(x_n, w_n)$:
$\bar{x}_{n+1} = x_n + \tau K^T (y - K x_n) - \tau \nabla^T w_n$
$w_{n+1} = P_\lambda\big(w_n + \tfrac{\sigma}{\tau} \nabla \bar{x}_{n+1}\big)$
$x_{n+1} = x_n + \tau K^T (y - K x_n) - \tau \nabla^T w_{n+1}$
$P_\lambda(u) = \big\{ \lambda\, u_i / \|u_i\| \ \text{if } \|u_i\| > \lambda;\ \ u_i \ \text{if } \|u_i\| \le \lambda \big\}_{i=1..M}$, with $u_i \in \mathbb{R}^D$ and $\|u_i\| = \sqrt{u_{i,x}^2 + u_{i,y}^2 + u_{i,z}^2}$.
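Below is a sketch of one GISTA iteration as written on the slides, transcribed into NumPy under assumptions of my own: a 2D image, $A = \nabla$ as forward differences (the helpers `grad2d` and `grad2d_adjoint` are mine), and a dense matrix standing in for the projector $K$. $P_\lambda$ clips each per-pixel gradient vector to the ball of radius $\lambda$.

```python
import numpy as np

def grad2d(x):
    """A x = nabla x: forward differences, zero at the last row/column."""
    gx = np.zeros_like(x); gx[:-1, :] = x[1:, :] - x[:-1, :]
    gy = np.zeros_like(x); gy[:, :-1] = x[:, 1:] - x[:, :-1]
    return np.stack([gx, gy])

def grad2d_adjoint(w):
    """nabla^T w: the adjoint of grad2d (a negative divergence)."""
    wx, wy = w
    ax = np.zeros_like(wx)
    ax[0, :] = -wx[0, :]
    ax[1:-1, :] = wx[:-2, :] - wx[1:-1, :]
    ax[-1, :] = wx[-2, :]
    ay = np.zeros_like(wy)
    ay[:, 0] = -wy[:, 0]
    ay[:, 1:-1] = wy[:, :-2] - wy[:, 1:-1]
    ay[:, -1] = wy[:, -2]
    return ax + ay

def P_lam(u, lam):
    """Clip each per-pixel gradient vector u_i to the ball of radius lam."""
    mag = np.sqrt((u ** 2).sum(axis=0, keepdims=True))
    return u * np.where(mag > lam, lam / np.maximum(mag, 1e-30), 1.0)

def gista_iteration(x, w, K, y, lam, tau, sigma):
    """One GISTA iteration (x_{n+1}, w_{n+1}) = I_GISTA(x_n, w_n)."""
    g = (K.T @ (y - K @ x.ravel())).reshape(x.shape)   # K^T (y - K x)
    x_bar = x + tau * g - tau * grad2d_adjoint(w)       # gradient step minus tau * nabla^T w_n
    w_new = P_lam(w + (sigma / tau) * grad2d(x_bar), lam)
    x_new = x + tau * g - tau * grad2d_adjoint(w_new)
    return x_new, w_new
```

Step sizes would follow the slide, e.g. `tau = 1.0 / np.linalg.norm(K, 2) ** 2` and, since the 2D forward-difference operator satisfies $\|\nabla\nabla^T\| \le 8$, something like `sigma = 0.9 / 8`.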

40 But GISTA is slow. Cross-section of a mouse, short scan, 98 projections, acquired on the Skyscan 1178, after 260 (left) and 1000 iterations (right), reconstructed with GISTA and $\lambda = 0.01$.

41 This work: how to go as fast as possible? Initialization; restart; FGISTA.

42 Numerical experiment
Dataset: Poisson noise, $10^4$ photons/LOR; FORBILD thorax phantom (imp.uni-erlangen.de); $P = 200$ projections of $J = 600$ pixels; 2D image of size $M$.
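As an illustration of the noise model only (placeholder sizes and seeds, not the actual experiment code), Poisson noise at roughly $10^4$ photons per line of response can be simulated by drawing counts around $I_0 e^{-y}$ and log-transforming back:

```python
import numpy as np

rng = np.random.default_rng(3)
I0 = 1e4                                    # roughly 10^4 photons per LOR
y_true = rng.random((200, 600)) * 2.0       # stand-in sinogram: P = 200 views, J = 600 bins
counts = rng.poisson(I0 * np.exp(-y_true))  # Poisson counts around the expected intensity
counts = np.maximum(counts, 1)              # guard against log(0) for fully absorbed rays
y_noisy = np.log(I0 / counts)               # noisy line integrals fed to the reconstruction
```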

43 GISTA: initialization matters. How many initial SART iterations lead to the lowest cost within $N < 10^4$ iterations?
[Plots: cost function vs. iteration for $\lambda = 0.0025$, starting with different numbers of initial SART iterations; and the best number of initial SART iterations as a function of $\lambda$.]

44 Restarted GISTA
Initialization: $M_1$ iterations of SART
Iteration: (1) perform 1 iteration of SART; (2) run GISTA during $N$ iterations; (3) set $w = 0$; (4) back to (1)
Inspired by the restart of conjugate gradient (see also: [O'Donoghue and Candès, 2012; Powell, 1977; Sidky and Pan, 2008])
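A sketch of the restart schedule described above, assuming callables like the `sart_iteration` and `gista_iteration` sketches given earlier; the function name, argument names and stopping rule are mine.

```python
import numpy as np

def restarted_gista(x0, n_init_sart, n_cycles, n_gista, sart_step, gista_step):
    """RGISTA sketch: SART warm start, then cycles of (1 SART iter, N GISTA iters, reset w).

    sart_step  : callable x -> x            (one SART iteration)
    gista_step : callable (x, w) -> (x, w)  (one GISTA iteration)
    """
    x = x0
    for _ in range(n_init_sart):            # initialization: M_1 iterations of SART
        x = sart_step(x)
    w = np.zeros((2,) + x.shape)            # dual variable: one gradient vector per pixel
    for _ in range(n_cycles):
        x = sart_step(x)                    # (1) perform 1 iteration of SART
        for _ in range(n_gista):            # (2) run GISTA during N iterations
            x, w = gista_step(x, w)
        w = np.zeros_like(w)                # (3) set w = 0, (4) back to (1)
    return x
```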

45 Restarted GISTA (RGISTA). Does RGISTA lead to a lower cost within $N < 10^2$ iterations?

46 Restarted GISTA (RGISTA). Does RGISTA lead to a lower cost within $N < 10^2$ iterations?
[Plot: cost function vs. iteration for $\lambda = 0.5$, without restart and with restarts.] NO

47 Restarted GISTA (RGISTA). Does RGISTA lead to a lower cost within $N < 10^2$ iterations?
[Plots: cost function vs. iteration for $\lambda = 0.5$ (no restart vs. restarts) and for $\lambda = 0.0025$ (no restart vs. restart after 2 iterations).] NO for $\lambda = 0.5$, YES for $\lambda = 0.0025$.

48 Restarted GISTA (RGISTA). Does RGISTA lead to a lower cost within $N < 10^2$ iterations?
[Plots: cost function vs. iteration for $\lambda = 0.5$ (no restart vs. restarts) and for $\lambda = 0.0025$ (no restart vs. restart after 2 iterations).] NO for $\lambda = 0.5$, YES for $\lambda = 0.0025$.
e.g. restart after 2 iterations: 6 iterations instead of 18.

49 Restarted GISTA (RGISTA). Does RGISTA lead to a lower cost within $N < 10^2$ iterations?
[Plots: cost function vs. iteration for $\lambda = 0.5$ (no restart vs. restarts) and for $\lambda = 0.0025$ (no restart vs. restart after 2 iterations).] NO for $\lambda = 0.5$, YES for $\lambda = 0.0025$.
e.g. restart after 2 iterations: 6 iterations instead of 18.
The efficiency of RGISTA depends on $\lambda$.

50 Restarted GISTA (RGISTA). [Images: reconstructions for $\lambda = 0.5$ and for a second value of $\lambda$, after a few iterations and after 5000 iterations.]

51 FGISTA
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $w_{-1} = w_0 = 0 \in \mathbb{R}^{D \times M}$; $t_0 = 1$, $\theta_0 = 0$

52 FGISTA
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $w_{-1} = w_0 = 0 \in \mathbb{R}^{D \times M}$; $t_0 = 1$, $\theta_0 = 0$
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{FGISTA}}(x_n, w_n, x_{n-1}, w_{n-1})$:
$v_n = (1 + \theta_n)\, x_n - \theta_n\, x_{n-1}$
$z_n = (1 + \theta_n)\, w_n - \theta_n\, w_{n-1}$
$(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(v_n, z_n)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$

53 FGISTA
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $w_{-1} = w_0 = 0 \in \mathbb{R}^{D \times M}$; $t_0 = 1$, $\theta_0 = 0$
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{FGISTA}}(x_n, w_n, x_{n-1}, w_{n-1})$:
$v_n = (1 + \theta_n)\, x_n - \theta_n\, x_{n-1}$
$z_n = (1 + \theta_n)\, w_n - \theta_n\, w_{n-1}$
$(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(v_n, z_n)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$
same fixed points as GISTA

54 FGISTA
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $w_{-1} = w_0 = 0 \in \mathbb{R}^{D \times M}$; $t_0 = 1$, $\theta_0 = 0$
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{FGISTA}}(x_n, w_n, x_{n-1}, w_{n-1})$:
$v_n = (1 + \theta_n)\, x_n - \theta_n\, x_{n-1}$
$z_n = (1 + \theta_n)\, w_n - \theta_n\, w_{n-1}$
$(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(v_n, z_n)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$
same fixed points as GISTA
reduces to FISTA when $A$ is orthogonal

55 FGISTA
Initialization: $x_{-1} = 0$; $x_0$: an arbitrary image $\in \mathbb{R}^M$; $w_{-1} = w_0 = 0 \in \mathbb{R}^{D \times M}$; $t_0 = 1$, $\theta_0 = 0$
Iteration: $(x_{n+1}, w_{n+1}) = I_{\mathrm{FGISTA}}(x_n, w_n, x_{n-1}, w_{n-1})$:
$v_n = (1 + \theta_n)\, x_n - \theta_n\, x_{n-1}$
$z_n = (1 + \theta_n)\, w_n - \theta_n\, w_{n-1}$
$(x_{n+1}, w_{n+1}) = I_{\mathrm{GISTA}}(v_n, z_n)$
$(t_{n+1}, \theta_{n+1}) = s(t_n)$
same fixed points as GISTA
reduces to FISTA when $A$ is orthogonal
no proof of convergence
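A sketch of the FGISTA loop above, again assuming a `gista_step` callable such as the earlier `gista_iteration` sketch; the momentum update $s(t_n)$ is the same as in FISTA, and the transcription into code is mine.

```python
import numpy as np

def fgista(gista_step, x0, w0, n_iter):
    """FGISTA sketch: FISTA-like extrapolation applied to both GISTA variables (x, w)."""
    x_prev, w_prev = np.zeros_like(x0), np.zeros_like(w0)    # x_{-1} = 0, w_{-1} = 0
    x, w = x0.copy(), w0.copy()
    t, theta = 1.0, 0.0                                      # t_0 = 1, theta_0 = 0
    for n in range(n_iter):
        v = (1.0 + theta) * x - theta * x_prev               # extrapolated image
        z = (1.0 + theta) * w - theta * w_prev               # extrapolated dual variable
        x_prev, w_prev = x, w
        x, w = gista_step(v, z)                              # (x_{n+1}, w_{n+1}) = I_GISTA(v, z)
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        theta = (t - 1.0) / t_next                           # (t_{n+1}, theta_{n+1}) = s(t_n)
        t = t_next
    return x, w
```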

56 FGISTA. [Plot: cost function vs. iteration for $\lambda = 0.25$: GISTA, FGISTA, and FGISTA switched to GISTA after 15 iterations.]

57 FGISTA. [Images and x-profiles: FGISTA vs. GISTA, $\lambda = 0.25$, 100 iterations.]

58 Discussion. FGISTA and GISTA share the same fixed points.

59 Discussion. FGISTA and GISTA share the same fixed points.
Why do FGISTA and GISTA not converge to the same values? Rounding errors in the algorithm? A limit cycle? Another update of the parameters? (cf. Chambolle-Pock)

60 Discussion. FGISTA and GISTA share the same fixed points.
Why do FGISTA and GISTA not converge to the same values? Rounding errors in the algorithm? A limit cycle? Another update of the parameters? (cf. Chambolle-Pock)
A fixed-point algorithm that appears to converge numerically does not necessarily minimize $\Phi$.

61 Discussion. FGISTA and GISTA share the same fixed points.
Why do FGISTA and GISTA not converge to the same values? Rounding errors in the algorithm? A limit cycle? Another update of the parameters? (cf. Chambolle-Pock)
A fixed-point algorithm that appears to converge numerically does not necessarily minimize $\Phi$.

62 Open issues. How to determine, on the fly, the optimal initialization?

63 Open issues. How to determine, on the fly: the optimal initialization? The optimal number and position of the restarts?

64 Open issues. How to determine, on the fly: the optimal initialization? The optimal number and position of the restarts? Why is cost(FGISTA) > cost(GISTA)?

65 Remember. GISTA reconstructs CT images with proven convergence and no internal iteration.

66 Remember. GISTA reconstructs CT images with proven convergence and no internal iteration. Initialization matters.

67 Remember. GISTA reconstructs CT images with proven convergence and no internal iteration. Initialization matters. Restart and FGISTA may help further.

68 References
A. H. Andersen and A. C. Kak. Simultaneous algebraic reconstruction technique (SART): a superior implementation of the ART algorithm. Ultrasonic Imaging, 6:81-94, 1984.
Amir Beck and Marc Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sciences, 2:183-202, 2009a.
Amir Beck and Marc Teboulle. Fast gradient-based algorithms for constrained total variation image denoising and deblurring problems. 2009b.
G.-H. Chen, J. Tang, and S. Leng. Prior image constrained compressed sensing (PICCS): a method to accurately reconstruct dynamic CT images from highly undersampled projection data sets. Med. Phys., 35, 2008.
I. Daubechies, M. Defrise, and C. De Mol. An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Communications on Pure and Applied Mathematics, 57(11):1413-1457, 2004.
Michel Defrise, Christian Vanhove, and Xuan Liu. An algorithm for total variation regularization in high-dimensional linear problems. Inverse Problems, 27(6):065002, 2011.
Ignace Loris and Caroline Verhoeven. On a generalization of the iterative soft-thresholding algorithm for the case of non-separable penalty. Inverse Problems, 27:125007, 2011.
B. O'Donoghue and E. Candès. Adaptive restart for accelerated gradient schemes. arXiv preprint, April 2012.
M. J. D. Powell. Restart procedures for the conjugate gradient method. Mathematical Programming, 12, 1977.
S. Ramani and J. A. Fessler. A splitting-based iterative algorithm for accelerated statistical X-ray CT reconstruction. IEEE Trans. Med. Imaging, 31(3), 2012.
L. Rudin, S. Osher, and E. Fatemi. Nonlinear total variation based noise removal algorithms. Physica D, 60, 1992.
Emil Y. Sidky and Xiaochuan Pan. Image reconstruction in circular cone-beam computed tomography by constrained, total-variation minimization. Physics in Medicine and Biology, 53(17), 2008.
Emil Y. Sidky, Chien-Min Kao, and Xiaochuan Pan. Accurate image reconstruction from few-views and limited-angle data in divergent-beam CT. J. X-Ray Sci. Technol., 14, 2006.

69 Chambolle-Pock derived algorithm
Initialization: $p_0 = 0 \in \mathbb{R}^{PJ}$; $w_0 = 0 \in \mathbb{R}^{D \times M}$; $x_0 = 0 \in \mathbb{R}^M$; $\tau \sigma < 1/\|K\|$
Iteration: $(p_{n+1}, w_{n+1}, x_{n+1}) = I_{\mathrm{CP}}(p_n, w_n, x_n)$:
$p_{n+1} = \big(p_n + \sigma (y - K x_n)\big)/(1 + \sigma)$
$w_{n+1} = P_\lambda\big(w_n + \sigma \nabla x_{n+1}\big)$
$x_{n+1} = x_n + \tau K^T p_{n+1} - \tau \nabla^T w_{n+1}$
Care must be taken to adapt the dimensions of $K$ to those of $\nabla$.

70 Philips Brightview XCT: TV leads to less noise, flat regions and sharp edges. Flat panel with square detector elements; images reconstructed with a resolution of 0.8 mm.
Figure: (a) SART, P = 720 views, 5 iterations. (b) SART, P = 100 views, 5 iterations. (c) profiles (blue = SART, P = 720; red = GISTA, P = 720). (d) GISTA, P = 720 views, $\lambda = 0.018$, 30 iterations. The images are averages of 3 consecutive slices.

71 Philips Brightview XCT: RISTA vs GISTA
Figure: (a) (LS, TV) curves for GISTA, P = 100 views; the 41st iterations are highlighted by a cross. (b) Profiles at the 41st iteration. (c) GISTA, 41 iterations. (d) RISTA, 41 iterations. The images are averages of 3 consecutive slices. In this figure, blue = GISTA without restart, green = GISTA with restart after the 10th iteration (RISTA). See Fig. 1 for SART with 100 views.

72 Skyscan

73 Skyscan

74 Skyscan

75 Skyscan

76 Skyscan

77 Skyscan
