Regularizing inverse problems using sparsity-based signal models

1 Regularizing inverse problems using sparsity-based signal models
Jeffrey A. Fessler, William L. Root Professor of EECS
EECS Dept., BME Dept., Dept. of Radiology, University of Michigan
web.eecs.umich.edu/~fessler
Work with Sai Ravishankar and Raj Nadakuditi
CSP Seminar, 31 Mar

2 Disclosure
Research support from GE Healthcare.
Supported in part by NIH grants P01 CA-87634 and U01 EB.
Equipment support from Intel Corporation.

3 Outline
Why:
  Low-dose X-ray CT imaging
  Accelerated MR imaging
  Other inverse problems
How:
  MAP estimation for inverse problems
  Classical regularization methods (some sparsity based)
  Contemporary regularization methods (all sparsity based)

5 Improving X-ray CT image reconstruction
A picture is worth 1000 words (and perhaps several thousand seconds of computation?).
[Figure: three reconstructions - thin-slice FBP (seconds), ASIR denoising (a bit longer), statistical reconstruction (much longer)]
Today's talk: less about computation, more about image quality.
The right image used edge-preserving regularization.

6 Accelerating MR imaging
[Figure from [1, Fig. 10]: (a) 4× under-sampled MR k-space; (b) zero-filled reconstruction; (c) compressed sensing reconstruction with TV regularization; (d) adaptive dictionary-learning regularization]

7 Other ill-posed inverse problems
  $y = Ax + \varepsilon$
  compressed sensing (A random, wide)
  deblurring / restoration (A Toeplitz, wide?)
  in-painting (A a subset of rows of I)
  denoising (A = I; not ill-posed)
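
A minimal numpy sketch of these forward models (the sizes, kernel, and sampling pattern are my own illustrative choices, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
x = rng.standard_normal(n)

# Compressed sensing: A random and wide (m < n).
A_cs = rng.standard_normal((32, n))

# Deblurring: each row applies a (symmetric) blur kernel -> Toeplitz A.
h = np.array([0.25, 0.5, 0.25])
A_blur = np.array([np.convolve(e, h, mode="same") for e in np.eye(n)])

# In-painting: A keeps a subset of rows of the identity.
A_inpaint = np.eye(n)[np.sort(rng.choice(n, size=40, replace=False))]

# Denoising: A = I (not ill-posed).
A_denoise = np.eye(n)

for A in (A_cs, A_blur, A_inpaint, A_denoise):
    y = A @ x + 0.01 * rng.standard_normal(A.shape[0])   # y = A x + eps
```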

10 Inverse problems via MAP estimation
[Block diagram: unknown image x → system model p(y|x) → data y → estimator → $\hat{x}$]
If we have a prior p(x), then the MAP estimate is
  $\hat{x} = \arg\max_x p(x|y) = \arg\max_x\ \log p(y|x) + \log p(x)$.
For Gaussian measurement errors and a linear model:
  $\log p(y|x) \equiv -\frac{1}{2} \|y - Ax\|_W^2$, where $\|y\|_W^2 = y'Wy$ and $W^{-1} = \mathrm{Cov}\{y|x\}$ is known.
(A comes from physics, W from statistics.)

11 Priors for MAP estimation
If all images x are plausible (have non-zero probability), then
  $p(x) \propto e^{-R(x)} \iff -\log p(x) = R(x) + \mathrm{const}$
(from fantasy / imagination / wishful thinking).
MAP becomes regularized weighted least-squares (WLS) estimation:
  $\hat{x} = \arg\max_x\ \log p(y|x) + \log p(x) = \arg\min_x\ \frac{1}{2} \|y - Ax\|_W^2 + R(x)$
A regularizer R(x), a.k.a. the negative log prior, is essential for high-quality solutions to ill-conditioned / ill-posed inverse problems.
Why ill-posed? High ambitions.
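
A hedged numpy sketch of this regularized WLS estimator, solved by plain gradient descent under the assumption that R is differentiable (function names, step size, and the Tikhonov example are mine, not the talk's):

```python
import numpy as np

def map_wls(y, A, W, grad_R, x0, step=1e-3, iters=500):
    """Minimize 0.5*||y - A x||_W^2 + R(x) by gradient descent.
    grad_R: function returning the gradient of the regularizer R."""
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (W @ (A @ x - y)) + grad_R(x)
        x -= step * grad
    return x

# Example regularizer: Tikhonov, R(x) = beta*||x||_2^2, gradient 2*beta*x.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 50))          # wide -> ill-posed without R
W = np.eye(30)
y = A @ rng.standard_normal(50) + 0.01 * rng.standard_normal(30)
beta = 0.1
x_hat = map_wls(y, A, W, lambda x: 2 * beta * x, np.zeros(50))
```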

12 Example of ill-conditioned inverse problem
Two-pixel, two-ray X-ray tomography model:
  $y = Ax$, with $x = (x_1, x_2)$ and a $2 \times 2$ system matrix $A$ for which $\mathrm{cond}(A'A) \approx 400$
A is (roughly) square - somewhat typical.
[Figure: the two-ray geometry, and contours of the log-likelihood $\log p(y|x)$ over $(x_1, x_2)$, elongated along the poorly determined direction]

13 Subspace model: Alternative to regularization
Assuming x lies in a sufficiently low-dimensional subspace could make an inverse problem well conditioned.
Assume $x = Dz$ where $D = [1\ 1]'$ and $z \in \mathbb{R}^1$
(z has only one nonzero element, so very sparse!?)
Estimate coefficient(s): $\hat{z} = \arg\min_z \|y - ADz\|_2^2$, then $\hat{x} = D\hat{z}$,
where $B \triangleq AD$ and $\mathrm{cond}(B'B) = 1$, which is perfect!
[Figure: the $(x_1, x_2)$ plane with the one-dimensional subspace $x_1 = x_2$]
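
A numpy sketch of this two-pixel example; the matrix A below is my own ill-conditioned stand-in (the slide's entries were not preserved), chosen so that cond(A'A) is near 400:

```python
import numpy as np

A = np.array([[1.0, 0.9],
              [0.9, 1.0]])            # ill-conditioned: cond(A'A) ~ 361
D = np.array([[1.0], [1.0]])          # subspace model x = D z, z scalar

x_true = np.array([2.0, 2.0])         # lies in the subspace x1 = x2
y = A @ x_true + 0.01 * np.random.default_rng(0).standard_normal(2)

B = A @ D                             # composite model y ~ B z
z_hat, *_ = np.linalg.lstsq(B, y, rcond=None)
x_hat = D @ z_hat                     # well-conditioned estimate
print(np.linalg.cond(A.T @ A), np.linalg.cond(B.T @ B))  # large vs 1.0
```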

14 Why not use subspace models?
Candès and Romberg (2005) [2] used 22 (noiseless) projection views, each with 256 samples, i.e., 5632 measured values, vs. $256 \times 256 = 65536$ unknown pixels.
[Figure: the Shepp-Logan phantom and its representation in a low-dimensional subspace]
A subspace representation (using the pixel basis) is undesirably coarse.

16 Classical regularizers
  Tikhonov regularization (IID Gaussian prior)
  Roughness penalty (basic MRF prior)
  Sparsity in ambient space
  Edge-preserving regularization
  Total-variation (TV) regularization
  Black-box denoiser like NLM

17 Tikhonov regularization
  $R(x) = \beta \|x\|_2^2$
[Figure: contours in the $(x_1, x_2)$ plane; colors show the equivalent (normalized) prior $p(x)/p(0) = e^{-R(x)}$]
Equivalent to an IID Gaussian prior on x.
Makes any ill-conditioned / ill-posed problem well conditioned.
Ignores correlations between pixels.
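
With this quadratic regularizer the WLS cost has a closed-form minimizer: setting the gradient of $\frac{1}{2}\|y-Ax\|_W^2 + \beta\|x\|_2^2$ to zero gives the normal equations solved in this sketch (function name is mine):

```python
import numpy as np

def tikhonov(y, A, W, beta):
    """Closed-form minimizer of 0.5*||y - A x||_W^2 + beta*||x||_2^2,
    i.e. the solution of (A' W A + 2*beta*I) x = A' W y."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ W @ A + 2 * beta * np.eye(n), A.T @ W @ y)
```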

18 Sparsity regularization in ambient space
  $R(x) = \beta \|x\|_0 = \beta \sum_j \mathbb{1}\{x_j \neq 0\}$
[Figure: the corresponding prior concentrates on the coordinate axes of the $(x_1, x_2)$ plane]
Approximate Bayesian interpretation.
Non-convex.
IID model, so it also ignores correlations.

19 Sparsity regularization: convex relaxation
  $R(x) = \beta \|x\|_1 = \beta \sum_j |x_j|$
[Figure: diamond-shaped prior contours in the $(x_1, x_2)$ plane]
Equivalent to an IID Laplacian prior on x.
Also ignores correlations.
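
In the denoising special case (A = I, W = I), the $\ell_1$-regularized estimate is the familiar element-wise soft-thresholding rule; a minimal sketch:

```python
import numpy as np

def soft_threshold(y, beta):
    """Element-wise minimizer of 0.5*(y_j - x_j)^2 + beta*|x_j|:
    the denoising (A = I) special case of l1 regularization."""
    return np.sign(y) * np.maximum(np.abs(y) - beta, 0.0)
```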

20 Correlation
[Scatter plot: neighboring pixel values $x_j$ vs $x_{j-1}$ of the Shepp-Logan phantom (dithered), clustered near the diagonal $x_j = x_{j-1}$]
Caution: the Shepp-Logan phantom [3] was designed for testing non-Bayesian methods, not for designing signal models.
Q: What causes the spread??

21 Edge-preserving regularization
Neighboring pixels tend to have similar values, except near edges:
  $R(x) = \beta \sum_j \psi(x_j - x_{j-1})$
[Plot: potential function $\psi(t)$ vs $t/\delta$]
Equivalent to an improper prior (agnostic to the DC value).
Accounts for spatial correlations, but only very locally.
Used clinically now for low-dose X-ray CT image reconstruction.
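
A small sketch of one such regularizer for a 1D signal; the specific hyperbola potential below is an assumption of mine (the slide does not say which $\psi$ it plots):

```python
import numpy as np

def psi_hyperbola(t, delta):
    """One common edge-preserving potential (an assumption here):
    ~ t^2/2 for |t| << delta, ~ delta*|t| for |t| >> delta."""
    return delta**2 * (np.sqrt(1.0 + (t / delta) ** 2) - 1.0)

def R_edge(x, beta, delta):
    """R(x) = beta * sum_j psi(x_j - x_{j-1}) for a 1D signal x."""
    return beta * np.sum(psi_hyperbola(np.diff(x), delta))
```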

22 Total-variation (TV) regularization
Neighboring pixels tend to have similar values, except near edges ("gradient sparsity"):
  $R(x) = \beta\,\mathrm{TV}(x) = \beta \|Cx\|_1 = \beta \sum_j |x_j - x_{j-1}|$
[Plot: potential function $\psi(t) = |t|$]
Equivalent to an improper prior (agnostic to the DC value).
Accounts for correlations, but only very locally.
Well suited to the piecewise-constant Shepp-Logan phantom!
Used in many academic publications.
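
A minimal numpy sketch of the 1D TV regularizer (my own example signals), showing why it favors piecewise-constant signals:

```python
import numpy as np

def tv_1d(x, beta):
    """R(x) = beta * ||C x||_1 = beta * sum_j |x_j - x_{j-1}| (1D TV)."""
    return beta * np.sum(np.abs(np.diff(x)))

# A piecewise-constant signal has small TV even with a large jump:
x_pc = np.concatenate([np.zeros(50), np.ones(50)])        # TV = 1
x_noisy = x_pc + 0.1 * np.random.default_rng(0).standard_normal(100)
print(tv_1d(x_pc, 1.0), tv_1d(x_noisy, 1.0))              # small vs much larger
```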

23 Black-box denoiser as a regularizer
[Diagram: noisy image → denoiser → denoised image] Example: non-local means (NLM).
Corresponding regularizer [4]-[6]:
  $R(x) = \beta \frac{1}{2} \|x - \mathrm{NLM}(x)\|_2^2$
Encourages self-consistency with a denoised version of the image.
No evident Bayesian interpretation.
Variable splitting can facilitate minimization [7].
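
A hedged sketch of this self-consistency regularizer; the moving-average "denoiser" below is a stand-in for NLM, purely for illustration:

```python
import numpy as np

def denoise_stub(x):
    """Stand-in for a black-box denoiser such as NLM (a real implementation
    would go here); just a 3-tap moving average for illustration."""
    return np.convolve(x, np.ones(3) / 3.0, mode="same")

def R_denoiser(x, beta, denoiser=denoise_stub):
    """R(x) = beta * 0.5 * ||x - denoiser(x)||_2^2: penalize disagreement
    between x and its denoised version."""
    return beta * 0.5 * np.sum((x - denoiser(x)) ** 2)
```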

24 Many more regularizers / priors
  Transforms: wavelets, curvelets, ...
  Markov random field models
  Graphical models
  ...

26 Contemporary regularizers
  Convolutional sparsity
  Union of subspaces
  Sparse coding with a dictionary
  Manifolds? [8]

27 Convolutional sparsity
Idea:
  $x[n] \approx \sum_{k=1}^{K} h_k[n] * z_k[n]$
where each $h_k[n]$ is an FIR filter with $\|h_k\|_2 = 1$ and each coefficient image $z_k[n]$ is sparse [9]-[11].
Equivalent matrix-vector representation:
  $x \approx \sum_{k=1}^{K} H_k z_k$
where $H_k$ is a Toeplitz (or circulant) matrix corresponding to $h_k$.

28 Convolutional sparsity: example
[Figure: three FIR filters $h_1[n], h_2[n], h_3[n]$, their sparse coefficient signals $z_1[n], z_2[n], z_3[n]$, and the synthesized signal $x[n] \approx \sum_{k=1}^{3} h_k[n] * z_k[n]$]
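
A minimal numpy sketch of this synthesis model (random filters and spike locations, purely illustrative; in practice the filters would be learned):

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 128, 3

# Unit-norm FIR filters.
filters = [rng.standard_normal(9) for _ in range(K)]
filters = [h / np.linalg.norm(h) for h in filters]

# Sparse coefficient signals: a few nonzero spikes each.
coeffs = []
for _ in range(K):
    z = np.zeros(n)
    z[rng.choice(n, size=4, replace=False)] = rng.standard_normal(4)
    coeffs.append(z)

# Synthesis: x[n] = sum_k (h_k * z_k)[n].
x = sum(np.convolve(z, h, mode="same") for h, z in zip(filters, coeffs))
```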

29 Convolutional sparsity: regularizer
Recall $x \approx \sum_{k=1}^{K} H_k z_k$. Natural corresponding regularizer:
  $R(x) = \min_{\{z_k\}} \min_{\{h_k:\, \|h_k\|_2 = 1\}} \Big\| x - \sum_{k=1}^{K} H_k z_k \Big\|_2^2 + \lambda^2 \sum_{k=1}^{K} \|z_k\|_0$
Adapts the FIR filters $\{h_k\}$ and coefficients $\{z_k\}$ to the candidate x.
The literature focuses on the minimization problem (sparse coding).
Yet to be explored as a regularizer for inverse problems.
Inherently shift-invariant representation; no patches needed.

30 Union of subspaces model
[Figure: a union of low-dimensional subspaces (lines through the origin) in the $(x_1, x_2)$ plane]
Dimensionality reduction?
cf. classification / clustering motivation [12]
(Extension to a union of flats (linear varieties) is possible [13].)

31 Union of subspaces regularization
Given (?) a collection of K subspace bases $D_1, \ldots, D_K$ (dictionaries with full column rank):
  $R(x) = \min_k\ \beta \frac{1}{2} \min_{z_k} \|x - D_k z_k\|_2^2 = \min_k\ \beta \frac{1}{2} \|x - D_k D_k^+ x\|_2^2$
(the outer min over k is classification; the inner min over $z_k$ is regression)
R(x) = 0 if x lies in the span of any of the dictionaries $\{D_k\}$;
otherwise, it is the distance to the nearest subspace (discourage, not constrain).
Non-convex (highly?) (cf. the preceding picture).
Apply to image patches to be practical.
Equivalent Bayesian interpretation? (not a mixture model here)
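
A numpy sketch of this regularizer via orthogonal projections (function name is mine; the example bases are the ones used on slide 33 below):

```python
import numpy as np

def R_union_of_subspaces(x, bases, beta):
    """R(x) = min_k beta*0.5*||x - D_k D_k^+ x||^2: squared distance from x
    to the nearest subspace among span(D_1), ..., span(D_K)."""
    dists = []
    for D in bases:
        proj = D @ np.linalg.pinv(D) @ x   # orthogonal projection onto span(D)
        dists.append(beta * 0.5 * np.sum((x - proj) ** 2))
    return min(dists)

D1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])   # spans the x-y plane
D2 = np.array([[0.0], [0.0], [1.0]])                  # spans the z-axis
print(R_union_of_subspaces(np.array([1.0, 2.0, 0.0]), [D1, D2], 1.0))  # 0.0
print(R_union_of_subspaces(np.array([1.0, 0.0, 1.0]), [D1, D2], 1.0))  # > 0
```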

32 Regularizers using sparse coding with dictionaries
Assume $x \approx Dz$ where D is a dictionary (often over-complete) and z is a sparse coefficient vector. Corresponding regularizers:
  $R(x) = \min_{z:\, \|z\|_p \le s} \beta \frac{1}{2} \|x - Dz\|_2^2$
  $R(x) = \min_z \Big( \beta_1 \frac{1}{2} \|x - Dz\|_2^2 + \beta_2 \|z\|_p \Big)$
Convex in z (for given x) if $p \ge 1$; R(x) is typically non-convex in x.
Could be equivalent to a union-of-subspaces regularizer if $D = [D_1 \ldots D_K]$ and if we constrain the coefficient vector z in a non-standard way.
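
A sketch of evaluating the penalized form for p = 1 using ISTA (my choice of algorithm; the slide does not prescribe one):

```python
import numpy as np

def sparse_code_ista(x, D, beta1, beta2, iters=200):
    """Evaluate R(x) = min_z beta1*0.5*||x - D z||^2 + beta2*||z||_1
    by iterative shrinkage-thresholding (ISTA)."""
    L = beta1 * np.linalg.norm(D, 2) ** 2    # Lipschitz constant of smooth part
    z = np.zeros(D.shape[1])
    for _ in range(iters):
        grad = beta1 * D.T @ (D @ z - x)
        z = z - grad / L
        z = np.sign(z) * np.maximum(np.abs(z) - beta2 / L, 0.0)  # soft threshold
    return beta1 * 0.5 * np.sum((x - D @ z) ** 2) + beta2 * np.sum(np.abs(z)), z
```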

33 Union-of-subspaces vs sparse-coding-with-dictionary
Consider the union-of-subspaces model with
  $D_1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}$, $D_2 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$,
so $D_1$ spans the x-y plane and $D_2$ spans the z-axis.
A dictionary model with $D = [D_1\ D_2] = I_3$ and sparsity s = 2 happily represents all three cardinal planes.
Thus the dictionary model seems less constrained than the union-of-subspaces model.
(Still, we focus on the sparse dictionary representation hereafter.)
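
A quick numeric check of this point: with $D = I_3$ and s = 2, a point in the x-z plane such as $x = (1, 0, 1)$ is represented exactly, even though it lies in neither of the two subspaces above:

```python
import numpy as np

D = np.eye(3)                  # D = [D1 D2]
x = np.array([1.0, 0.0, 1.0])  # in the x-z plane, not in span(D1) or span(D2)
z = D.T @ x                    # z = (1, 0, 1): only 2 nonzeros, so s = 2
print(np.count_nonzero(z), np.allclose(D @ z, x))   # 2 True
```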

34 Dictionary learning from training data
Given training data $x_1, \ldots, x_N \in \mathbb{R}^d$ (image patches).
Assumed model: $x_n \approx D z_n$, with an unknown $d \times J$ dictionary $D = [d_1 \ldots d_J]$ and coefficient vectors $z_1, \ldots, z_N \in \mathbb{R}^J$ assumed sparse.
K-SVD dictionary learning formulation [14]:
  $\hat{D} = \arg\min_{D \in \mathbb{R}^{d \times J}} \sum_{n=1}^{N} \min_{z_n \in \mathbb{R}^J} \|x_n - D z_n\|_2^2$  s.t. $\|d_j\|_2 = 1\ \forall j$, $\|z_n\|_0 \le s\ \forall n$
  $= \arg\min_{D \in \mathbb{R}^{d \times J}} \min_{Z \in \mathbb{R}^{J \times N}} \|X - DZ\|_F^2$  s.t. the same constraints, where $X \triangleq [x_1 \ldots x_N]$ and $Z \triangleq [z_1 \ldots z_N]$.
Computationally expensive and no convergence guarantees.
Inherently non-convex due to the product of unknowns DZ.

35 New dictionary learning method (SOUP-DIL)
Joint work with Sai Ravishankar and Raj Nadakuditi [15]-[18].
Write the sparse representation as a Sum of OUter Products (SOUP):
  $X \approx DZ = DC^T = \sum_{j=1}^{J} d_j c_j^T$, where $Z^T = C = [c_1 \ldots c_J] \in \mathbb{R}^{N \times J}$ (coefficients for each atom).
Replace the individual sparsity constraints $\|z_n\|_0 \le s$ with an aggregate sparsity regularizer $\|Z\|_0 = \|C\|_0$:
  natural for DIctionary Learning (DIL) from training data;
  unnatural for image compression using sparse coding.
SOUP-DIL $\ell_0$ formulation:
  $\hat{D} = \arg\min_{D \in \mathbb{R}^{d \times J}} \min_{C \in \mathbb{R}^{N \times J}} \|X - DC^T\|_F^2 + \lambda^2 \|C\|_0$  s.t. $\|d_j\|_2 = 1$, $\|c_j\|_\infty \le L\ \forall j$

36 SOUP-DIL algorithm
SOUP-DIL formulation (as above), minimized by a block coordinate descent (BCD) algorithm:
  sparse coding step for C;
  dictionary update step for D.
Very simple update rules (low compute cost).
Monotone descent of the cost $\Psi(D, C)$.
Convergence theorem: for any initialization $(D^0, C^0)$, all accumulation points of the sequence $(D, C)$ are critical points of the cost $\Psi$ and are equivalent (they reach the same cost value $\Psi^*$).
Furthermore, $\|D^{(k)} - D^{(k-1)}\| \to 0$, and likewise for $\{C^{(k)}\}$.

37 SOUP-DIL updates
Alternate: update one column $d_j$ of D, then one column $c_j$ of C.
Sparse coding step: update $c_j$ using the residual $E_j \triangleq X - \sum_{k \neq j} d_k c_k^T$:
  $\min_{c_j} \|E_j - d_j c_j^T\|_F^2 + \lambda^2 \|c_j\|_0$  s.t. $\|c_j\|_\infty \le L$
  Solution: truncated (via L) hard thresholding of $E_j^T d_j$ with threshold $\lambda$.
Dictionary atom step: update $d_j$:
  $\min_{d_j} \|E_j - d_j c_j^T\|_F^2$  s.t. $\|d_j\|_2 = 1$
  Constrained least-squares solution: $d_j = E_j c_j / \|E_j c_j\|_2$.

38 Truncated hard thresholding for SOUP-DIL
[Plot: the element-wise update of $c_j \in \mathbb{R}^N$ as a function of $E_j^T d_j \in \mathbb{R}^N$: zero for magnitudes below $\lambda$, identity above, clipped at $\pm L$]
(Acts element-wise.) (In practice take $L = \infty$.)
(The algorithm also provides a simple sparse coding method.)
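
A minimal numpy sketch of one BCD sweep implementing these update rules, taking $L = \infty$ so the thresholding is plain hard thresholding (initialization, stopping criteria, and the zero-coefficient atom convention below are my assumptions):

```python
import numpy as np

def soup_dil_pass(X, D, C, lam):
    """One block-coordinate-descent sweep of SOUP-DIL (l0 variant, L = inf):
    X ~ D C^T with unit-norm atoms d_j and aggregate-sparse coefficients C."""
    J = D.shape[1]
    for j in range(J):
        # Residual excluding atom j: E_j = X - sum_{k != j} d_k c_k^T.
        E = X - D @ C.T + np.outer(D[:, j], C[:, j])
        # Sparse coding step: hard-threshold E_j^T d_j at lambda.
        b = E.T @ D[:, j]
        C[:, j] = np.where(np.abs(b) >= lam, b, 0.0)
        # Dictionary atom step: d_j = E_j c_j / ||E_j c_j||_2
        # (keep the old atom if c_j is all zero; one common convention).
        v = E @ C[:, j]
        if np.linalg.norm(v) > 0:
            D[:, j] = v / np.linalg.norm(v)
    return D, C
```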

39 Example: dictionary learning for Barbara
[Figure: the Barbara image, the learned K-SVD dictionary D, and the learned SOUP-DIL dictionary D]
[Table from [15]: denoising PSNR (dB) at several noise levels $\sigma$, comparing the noisy image, O-DCT, K-SVD, and SOUP-DIL]
SOUP-DIL is faster than K-SVD.

40 Regularization using SOUP-DIL
For a large image x, extract M patches: $X = [P_1 x \ldots P_M x]$.
Assume each patch $x_m = P_m x \approx D z_m$ has an (aggregate) sparse representation in a dictionary $D \in \mathbb{R}^{d \times J}$, where d is the patch size:
  $R(x) = R(X) = \min_{C \in \mathbb{R}^{M \times J}} \|X - DC^T\|_F^2 + \lambda^2 \|C\|_0$  s.t. $\|c_j\|_\infty \le L\ \forall j$
R(x) = 0 if the patches can be represented exactly with sufficiently few non-zero coefficients (depends on $\lambda$).
Ignore the constraint $\|c_j\|_\infty \le L$.
Bayesian interpretation?

41 CT reconstruction using a (known) dictionary regularizer
  $\hat{x} = \arg\min_x \frac{1}{2}\|y - Ax\|_W^2 + \beta R(x) = \arg\min_x \min_{C \in \mathbb{R}^{M \times J}} \frac{1}{2}\|y - Ax\|_W^2 + \beta \Big( \|X - DC^T\|_F^2 + \lambda^2 \|C\|_0 \Big)$
Alternating (nested) minimization:
  Fixing x, updating each column of C sequentially involves (truncated?) hard thresholding.
  Fixing C, updating x is a (large-scale) quadratic problem:
    $g(x) = \frac{1}{2}\|X - DC^T\|_F^2 = \sum_{m=1}^{M} \frac{1}{2}\|P_m x - D z_m\|_2^2$, where $z_m^T$ is the m-th row of C, and $\nabla^2 g(x) = \sum_{m=1}^{M} P_m^T P_m$ is diagonal.
Work in progress...
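
A hedged sketch of the x-update step only, taking W = I and folding the slide's constants into beta (my own simplification); conjugate gradients exploits the fact that the patch term acts diagonally:

```python
import numpy as np

def x_update(y, A, patches_idx, Dz, beta, n, iters=50):
    """Sketch: minimize 0.5*||y - A x||^2 + beta*sum_m 0.5*||P_m x - D z_m||^2
    by conjugate gradients. patches_idx[m] holds the (unique) pixel indices
    of patch m; Dz[m] = D @ z_m is the synthesized patch."""
    def H(v):                                  # Hessian-vector product
        out = A.T @ (A @ v)
        for idx in patches_idx:
            out[idx] += beta * v[idx]          # P_m^T P_m acts diagonally
        return out
    b = A.T @ y
    for idx, dz in zip(patches_idx, Dz):
        b[idx] += beta * dz                    # P_m^T (D z_m)
    x = np.zeros(n)
    r = b - H(x)
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Hp = H(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```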

42 Why Bayesian? Numerous normal-dose CT images exist!
  learn D, or most of it, from big data;
  learn the statistics of the sparse coefficients Z?
  replace the generic $\|Z\|_0$ with a penalty based on $-\log p(Z)$?

43 Extensions / future work
  Use majorization to update multiple columns of D or C simultaneously
  DC atom
  Rotate/flip atoms [19], [20]
  Rank constraints on dictionary atoms [16]
  Tensor-structured atoms for 3D / dynamic imaging
  Combined transform learning / dictionary learning
  Union of manifolds instead of union of subspaces?
  ...
Open problems: model selection; parameter selection; performance guarantees.

44 Bibliography I
[1] S. Ravishankar and Y. Bresler, "MR image reconstruction from highly undersampled k-space data by dictionary learning," IEEE Trans. Med. Imag., vol. 30, no. 5, May 2011.
[2] E. Candès and J. K. Romberg, "Signal recovery from random projections," in Proc. SPIE 5674 Computational Imaging III, 2005.
[3] L. A. Shepp and B. F. Logan, "The Fourier reconstruction of a head section," IEEE Trans. Nuc. Sci., vol. 21, no. 3, pp. 21-43, Jun. 1974.
[4] M. Mignotte, "A non-local regularization strategy for image deconvolution," Pattern Recognition Letters, vol. 29, no. 16, Dec. 2008.
[5] Z. Yang and M. Jacob, "Nonlocal regularization of inverse problems: A unified variational framework," IEEE Trans. Im. Proc., vol. 22, no. 8, Aug. 2013.
[6] H. Zhang, J. Ma, J. Wang, Y. Liu, H. Lu, and Z. Liang, "Statistical image reconstruction for low-dose CT using nonlocal means-based regularization," Computerized Medical Imaging and Graphics, vol. 38, no. 6, Sep. 2014.
[7] S. Y. Chun, Y. K. Dewaraja, and J. A. Fessler, "Alternating direction method of multiplier for tomography with non-local regularizers," IEEE Trans. Med. Imag., vol. 33, no. 10, Oct. 2014.
[8] S. Poddar and M. Jacob, "Dynamic MRI using smoothness regularization on manifolds (SToRM)," IEEE Trans. Med. Imag., 2016.
[9] J. Yang, K. Yu, and T. Huang, "Supervised translation-invariant sparse coding," in Proc. IEEE Conf. on Comp. Vision and Pattern Recognition, 2010.
[10] H. Bristow, A. Eriksson, and S. Lucey, "Fast convolutional sparse coding," in Proc. IEEE Conf. on Comp. Vision and Pattern Recognition, 2013.
[11] B. Wohlberg, "Efficient convolutional sparse coding," in Proc. IEEE Conf. Acoust. Speech Sig. Proc., 2014.

45 Bibliography II
[12] R. Vidal, "Subspace clustering," IEEE Sig. Proc. Mag., vol. 28, no. 2, pp. 52-68, Mar. 2011.
[13] T. Zhang, A. Szlam, and G. Lerman, "Median K-Flats for hybrid linear modeling with many outliers," in Proc. Intl. Conf. Comp. Vision, 2009.
[14] M. Aharon, M. Elad, and A. Bruckstein, "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation," IEEE Trans. Sig. Proc., vol. 54, no. 11, Nov. 2006.
[15] S. Ravishankar, R. R. Nadakuditi, and J. A. Fessler, "Efficient sum of outer products dictionary learning (SOUP-DIL) - the l0 method," IEEE Trans. Sig. Proc., 2015, submitted.
[16] S. Ravishankar, R. R. Nadakuditi, and J. A. Fessler, "Efficient learning of dictionaries with low-rank atoms," in Proc. IEEE Intl. Conf. on Image Processing, submitted.
[17] S. Ravishankar, R. R. Nadakuditi, and J. A. Fessler, "Efficient l0 dictionary learning with convergence guarantees," in Proc. Intl. Conf. Mach. Learn., submitted.
[18] S. Ravishankar, R. R. Nadakuditi, and J. A. Fessler, "Efficient sum of outer products dictionary learning (SOUP-DIL) - the l0 method," arXiv preprint.
[19] B. Wen, S. Ravishankar, and Y. Bresler, "FRIST - flipping and rotation invariant sparsifying transform learning and applications to inverse problems."
[20] B. Wen, Y. Bresler, and S. Ravishankar, "FRIST - flipping and rotation invariant sparsifying transform learning and applications," arXiv preprint.
