Sparsifying Transform Learning for Compressed Sensing MRI

1 Sparsifying Transform Learning for Compressed Sensing MRI
Saiprasad Ravishankar and Yoram Bresler
Department of Electrical and Computer Engineering and Coordinated Science Laboratory, University of Illinois at Urbana-Champaign
April 8, 2013

2 Outline
Why compressed sensing MRI (CSMRI)?
Nonadaptive CSMRI
Synthesis Dictionary Learning MRI
Transform vs. Synthesis Model
Transform Learning MRI: Formulations, Algorithms, Results (Static MRI)
Conclusions

3 Motivation for Compressed Sensing MRI
Data are samples in k-space, acquired sequentially in time. The acquisition rate is limited by MR physics, etc.
CS allows recovery of images from limited measurements:
- Sparsity in a transform domain or dictionary
- Acquisition incoherent with the sparse model
- Non-linear, non-convex reconstruction
Fig. from Lustig et al. '07

4 Compressed Sensing MRI (Nonadaptive)
$$\min_x \; \|F_u x - y\|_2^2 + \lambda \|\Psi x\|_1 \qquad (1)$$
x ∈ C^P is the image as a vector and y ∈ C^m the measurements. F_u ∈ C^{m×P} is the undersampled Fourier encoding matrix (m < P). Ψ ∈ C^{T×P} is a global, orthonormal transform.
A total variation penalty is also added to (1) [Lustig et al. '07].
CSMRI with non-adaptive transforms is limited to modest undersampling factors [Ma et al. '08].
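As a concrete illustration of how (1) is typically attacked, here is a minimal proximal-gradient (ISTA) sketch, not the LDP implementation itself; the zero-filled k-space convention, the unitary FFT, and the step size are assumptions of this sketch, and Psi/PsiH stand for any orthonormal analysis/synthesis pair (e.g., wavelets).

```python
import numpy as np

def ista_csmri(y, mask, Psi, PsiH, lam=0.01, step=0.5, n_iter=100):
    """Proximal gradient (ISTA) sketch for min_x ||F_u x - y||_2^2 + lam*||Psi x||_1.
    y: zero-filled k-space data on the full grid; mask: boolean sampling set Omega;
    Psi/PsiH: orthonormal analysis/synthesis operators (callables)."""
    x = np.zeros(mask.shape, dtype=complex)
    for _ in range(n_iter):
        # gradient of the data term: 2 F_u^H (F_u x - y)
        resid = mask * np.fft.fft2(x, norm="ortho") - y
        grad = 2 * np.fft.ifft2(resid, norm="ortho")
        z = x - step * grad
        # prox of step*lam*||Psi .||_1 for orthonormal Psi: soft-threshold in the Psi domain
        c = Psi(z)
        c = np.exp(1j * np.angle(c)) * np.maximum(np.abs(c) - step * lam, 0)
        x = PsiH(c)
    return x
```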

5 Synthesis Dictionary Learning
The DL problem:
$$\min_{D, \{\alpha_j\}} \; \sum_j \|R_j x - D\alpha_j\|_2^2 \quad \text{s.t.} \; \|\alpha_j\|_0 \le s \;\; \forall j \qquad (2)$$
R_j ∈ C^{n×P} extracts the j-th √n × √n patch of x. D ∈ C^{n×K} is a patch-based dictionary. α_j ∈ C^K is sparse, with R_j x ≈ D α_j. s is the sparsity level.
The DL problem is NP-hard. Algorithms such as K-SVD [Aharon et al. '06] alternate between finding D and {α_j}.

6 Learning Dictionaries from Undersampled Data (DLMRI) [Ravishankar & Bresler '11]
$$\text{(P0)}\quad \min_{x, D, \{\alpha_j\}} \; \underbrace{\sum_j \|R_j x - D\alpha_j\|_2^2}_{\text{sparse fitting}} \; + \; \nu \underbrace{\|F_u x - y\|_2^2}_{\text{data fidelity}} \quad \text{s.t.} \; \|\alpha_j\|_0 \le s \;\; \forall j$$
(P0) learns D and reconstructs x from only the undersampled data y.
But (P0) is NP-hard, and non-convex even if the ℓ0 quasi-norm is relaxed to ℓ1.
DLMRI solves (P0) by alternating between dictionary learning (solving for D and {α_j}) and a reconstruction update (solving for x).

7 2D Random Sampling Example - 6x Undersampling
[Figures: LDP reconstruction (22 dB) and error magnitude; DLMRI reconstruction (32 dB) and error magnitude.]
Data from Miki Lustig, UC Berkeley. LDP: Lustig, Donoho, and Pauly ('07).

8 Drawbacks of DLMRI
DLMRI computations do not scale well: O(K n^2 P) for a P-pixel image and D ∈ C^{n×K}.
The cost is dominated by dictionary learning, particularly sparse coding, which is by itself an NP-hard problem.
DL algorithms such as K-SVD can get stuck in bad local minima or even saddle points.
Can we learn better, more efficient sparse models for MR images?

9 Synthesis Model for Sparse Representation
Given a signal y ∈ C^n and a dictionary D ∈ C^{n×K}, we assume y = Dx with ||x||_0 ≪ K.
Real-world signals are modeled as y = Dx + e, where e is a deviation term.
Given D and sparsity level s,
$$\hat{x} = \arg\min_x \; \|y - Dx\|_2^2 \quad \text{s.t.} \; \|x\|_0 \le s$$
This is the NP-hard synthesis sparse coding problem. Greedy and ℓ1-relaxation algorithms are computationally expensive.
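For a sense of where the cost comes from, below is a minimal orthogonal matching pursuit (OMP) sketch for this problem (a standard greedy method, not the specific routine used inside K-SVD or DLMRI); each of the s passes correlates the residual with all K atoms and solves a least squares problem.

```python
import numpy as np

def omp(D, y, s):
    """Greedy OMP sketch: approximately solve min_x ||y - D x||_2^2 s.t. ||x||_0 <= s."""
    n, K = D.shape
    residual = y.copy()
    support = []
    x = np.zeros(K, dtype=complex)
    for _ in range(s):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.conj().T @ residual)))
        if j not in support:
            support.append(j)
        # least squares fit on the current support, then update the residual
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        x[:] = 0
        x[support] = coeffs
        residual = y - D[:, support] @ coeffs
    return x
```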

10 Transform Model for Sparse Representation
Given a signal y ∈ C^n and a transform W ∈ C^{m×n}, we model Wy = x + η, with ||x||_0 ≪ m and η an error term.
Natural images are approximately sparse in wavelets and the DCT.
Given W and sparsity s, transform sparse coding is
$$\hat{x} = \arg\min_x \; \|Wy - x\|_2^2 \quad \text{s.t.} \; \|x\|_0 \le s$$
x̂ is computed exactly by thresholding Wy. Sparse coding is cheap!
The signal is recovered from x̂ by inverting W (pseudo-inverse in general).
Sparsifying transforms are exploited in compression (JPEG2000), etc.
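A minimal sketch of the exact transform sparse coding step (keep the s largest-magnitude entries of Wy and zero the rest; names are illustrative):

```python
import numpy as np

def transform_sparse_code(W, y, s):
    """Exact solution of min_x ||W y - x||_2^2 s.t. ||x||_0 <= s."""
    z = W @ y
    x = np.zeros_like(z)
    if s > 0:
        idx = np.argsort(np.abs(z))[-s:]   # indices of the s largest magnitudes
        x[idx] = z[idx]
    return x
```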

11 Square Transform Learning
$$\text{(P1)}\quad \min_{W, \{\alpha_j\}} \; \underbrace{\sum_j \|W R_j x - \alpha_j\|_2^2}_{\text{sparsification error}} \; + \; \lambda \underbrace{\left(-\log|\det W| + \|W\|_F^2\right)}_{\text{regularizers}} \quad \text{s.t.} \; \|\alpha_j\|_0 \le s \;\; \forall j$$
The sparsification error measures the deviation of each patch in the transform domain from perfect sparsity. λ > 0.
The -log|det W| term restricts the solution to full-rank transforms. ||W||_F^2 keeps the objective function bounded from below.
Minimizing λ(-log|det W| + ||W||_F^2) encourages reduction of the condition number. The solution to (P1) is perfectly conditioned (κ = 1) as λ → ∞.
(P1) is non-convex.
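To make the regularizer concrete, here is a small helper that evaluates the (P1) objective for a candidate transform W, training patches stacked as columns of Y, and sparse codes stacked as columns of A; treating the log term as log|det W| (so that it is real-valued for complex W) is an assumption of this sketch.

```python
import numpy as np

def p1_objective(W, Y, A, lam):
    """(P1) objective: sparsification error + lam * (-log|det W| + ||W||_F^2)."""
    sparsification_error = np.linalg.norm(W @ Y - A, 'fro')**2
    regularizer = -np.log(np.abs(np.linalg.det(W))) + np.linalg.norm(W, 'fro')**2
    return sparsification_error + lam * regularizer
```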

12 Transform Learning (TL) Algorithm
The algorithm for (P1) alternates between updating {α_j} and W.
Sparse coding step: solves (P1) with W fixed.
$$\min_{\{\alpha_j\}} \; \sum_j \|W R_j x - \alpha_j\|_2^2 \quad \text{s.t.} \; \|\alpha_j\|_0 \le s \;\; \forall j \qquad (3)$$
Easy problem: the solution α̂_j is computed exactly by thresholding W R_j x and retaining the s largest-magnitude coefficients.
Transform update step: solves (P1) with the α_j fixed.
$$\min_W \; \sum_j \|W R_j x - \alpha_j\|_2^2 \; - \; \lambda \log|\det W| \; + \; \mu \|W\|_F^2 \qquad (4)$$
Closed-form solution: $\hat{W} = \tfrac{U}{2}\left(\Sigma + (\Sigma^2 + 2\lambda I_n)^{1/2}\right) Q^H L^{-1}$, where $\sum_j R_j x \, (R_j x)^H + \mu I_n = LL^H$ and $L^{-1} \sum_j R_j x \, \alpha_j^H = Q \Sigma U^H$ (full SVD).
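Below is a compact numpy sketch of one such alternation, assuming the training patches R_j x are stacked as the columns of an n x N matrix Y; the Cholesky/SVD steps mirror the closed-form expression above, and the variable names are illustrative.

```python
import numpy as np

def tl_iteration(Y, W, s, lam, mu):
    """One alternation of the (P1) transform learning sketch.
    Y: n x N matrix whose columns are the training patches R_j x.
    Returns updated sparse codes A (columns alpha_j) and transform W."""
    n, N = Y.shape
    # --- sparse coding: threshold W Y, keep the s largest-magnitude entries per column ---
    Z = W @ Y
    A = np.zeros_like(Z)
    keep = np.argsort(np.abs(Z), axis=0)[-s:, :]          # per-column top-s row indices
    cols = np.arange(N)[None, :].repeat(s, axis=0)
    A[keep, cols] = Z[keep, cols]
    # --- transform update: closed form via Cholesky factor L and full SVD ---
    L = np.linalg.cholesky(Y @ Y.conj().T + mu * np.eye(n))
    Linv = np.linalg.inv(L)
    Q, sig, UH = np.linalg.svd(Linv @ Y @ A.conj().T)      # Q diag(sig) U^H
    U = UH.conj().T
    gamma = 0.5 * (sig + np.sqrt(sig**2 + 2 * lam))        # (Sigma + sqrt(Sigma^2 + 2*lam*I))/2
    W_new = U @ np.diag(gamma) @ Q.conj().T @ Linv
    return A, W_new
```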

13 TL Properties
The objective converges for our exact alternating algorithm.
Empirical evidence suggests convergence to the same objective value regardless of initialization.
The computational cost of TL, O(M N n^2) for N training signals, M iterations, and W ∈ C^{n×n}, is significantly lower than the cost of DL, O(M N n^3): a reduction in order by n for a √n × √n patch.
Large values of λ enforce well-conditioning of the transform.

14 Transform Learning MRI (TLMRI)
$$\text{(P2)}\quad \min_{x, W, \{\alpha_j\}} \; \sum_j \|W R_j x - \alpha_j\|_2^2 + \lambda\, Q(W) + \nu \|F_u x - y\|_2^2 \quad \text{s.t.} \; \|\alpha_j\|_0 \le s \;\; \forall j$$
Similar to the DLMRI formulation, but uses the transform model. Q(W) = -log|det W| + ||W||_F^2.
We modify (P2) by introducing extra variables x̂_j in a penalty-type formulation, which leads to efficient algorithms:
$$\text{(P2)}\quad \min_{x, W, \{\hat{x}_j\}, \{\alpha_j\}} \; \sum_j \|W \hat{x}_j - \alpha_j\|_2^2 + \lambda\, Q(W) + \nu \|F_u x - y\|_2^2 + \tau \sum_j \|R_j x - \hat{x}_j\|_2^2 \quad \text{s.t.} \; \|\alpha_j\|_0 \le s_j \;\; \forall j$$
The penalty Σ_j ||R_j x - x̂_j||_2^2 will also help us adaptively choose the sparsity levels s_j.

15 TLMRI Algorithm - Denoising Step
(P2) is solved using alternating minimization. For a given (corrupted) image x, (P2) reduces to a denoising problem, with the x̂_j as denoised patches.
Denoising step:
$$\text{(P3)}\quad \min_{W, \{\hat{x}_j\}, \{\alpha_j\}} \; \sum_j \|W \hat{x}_j - \alpha_j\|_2^2 + \lambda\, Q(W) + \tau \sum_j \|R_j x - \hat{x}_j\|_2^2 \quad \text{s.t.} \; \|\alpha_j\|_0 \le s_j \;\; \forall j$$
Denoising involves:
- Transform learning (solve for W and {α_j} with the x̂_j fixed at R_j x and s_j = s).
- Variable-sparsity patch update (solving for the x̂_j and s_j). See the sketch after the next slide.

16 TLMRI Algorithm - Denoising Step
The variable-sparsity patch update involves solving
$$\text{(P3a)}\quad \min_{\{\hat{x}_j\}} \; \sum_j \|W \hat{x}_j - H_{s_j}(W R_j x)\|_2^2 + \tau \sum_j \|R_j x - \hat{x}_j\|_2^2$$
where H_s(b) keeps the s largest-magnitude elements of b ∈ C^n and zeros the rest.
For fixed s_j, (P3a) reduces to a separate least squares problem in each x̂_j.
As s_j ↗ n, the denoising error ||R_j x - x̂_j^{LS}||_2 ↘ 0, where x̂_j^{LS} is the least squares solution for the specific s_j.
We pick s_j so that the error is below a threshold C; this can be done efficiently [Ravishankar & Bresler '12]. C decreases over the iterations, as the iterates become more refined.
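One way to read the search over s_j: for a single patch, solve the small regularized least squares problem for increasing s_j until the error falls below C. A deliberately naive sketch, assuming a square W (the paper uses a more efficient procedure):

```python
import numpy as np

def patch_update(W, p, tau, C):
    """Variable-sparsity patch update sketch for (P3a), for one patch p = R_j x.
    Increases s until the denoising error ||p - x_hat||_2 drops below C."""
    n = p.size
    z = W @ p
    order = np.argsort(np.abs(z))[::-1]          # coefficients by decreasing magnitude
    G = np.linalg.inv(W.conj().T @ W + tau * np.eye(n))
    for s in range(n + 1):
        b = np.zeros_like(z)
        b[order[:s]] = z[order[:s]]              # b = H_s(W p)
        x_hat = G @ (W.conj().T @ b + tau * p)   # least squares solution for this s
        if np.linalg.norm(p - x_hat) <= C:
            break
    return x_hat, s
```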

17 TLMRI Algorithm - Reconstruction Update Step
Reconstruction update step:
$$\text{(P4)}\quad \min_x \; \tau \sum_j \|R_j x - \hat{x}_j\|_2^2 + \nu \|F_u x - y\|_2^2$$
The update is performed directly in k-space:
$$Fx(k_x, k_y) = \begin{cases} S(k_x, k_y), & (k_x, k_y) \notin \Omega \\[4pt] \dfrac{S(k_x, k_y) + \nu' S_0(k_x, k_y)}{1 + \nu'}, & (k_x, k_y) \in \Omega \end{cases} \qquad (5)$$
Fx(k_x, k_y) is the updated k-space value, S_0(k_x, k_y) the measured value, and Ω the sampled subset of k-space. $S = F \big(\sum_j R_j^H \hat{x}_j\big)/\beta$ and ν' = ν/(τβ), where β is the number of patches covering any pixel.
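A direct translation of (5) into numpy; the unitary-FFT convention and the argument names are assumptions of this sketch.

```python
import numpy as np

def kspace_update(patch_avg_image, y0, mask, nu, tau, beta):
    """Reconstruction update (P4)/(5) sketch, done directly in k-space.
    patch_avg_image: image formed as sum_j R_j^H x_hat_j / beta (patch averages);
    y0: measured k-space S_0 (zero-filled outside the sampling set Omega = mask)."""
    S = np.fft.fft2(patch_avg_image, norm="ortho")     # S = F * (patch-average image)
    nu_p = nu / (tau * beta)                           # effective weight nu' = nu/(tau*beta)
    Fx = np.where(mask, (S + nu_p * y0) / (1 + nu_p), S)
    return np.fft.ifft2(Fx, norm="ortho")              # back to the image domain
```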

18 TLMRI Algorithm Properties
Every step of our algorithm involves efficient closed-form solutions.
The per-iteration computational cost of TLMRI is lower than that of DLMRI by a factor of order n (the patch size).

19 Cartesian Sampling with 4x Undersampling
[Figures: original image; zero-filling reconstruction; PSNR vs. iteration number for TLMRI, DLMRI, LDP, and zero-filling.]
TLMRI with a square transform is better and 12x faster than 4x overcomplete DLMRI with 6x6 patches.
TLMRI is significantly better and also faster than LDP, which employs fixed transforms.

20 Cartesian Sampling with 4x Undersampling
[Figures: TLMRI and DLMRI reconstructions and the corresponding error magnitude maps.]

21 Unconstrained TLMRI
The TLMRI algorithm requires setting error thresholds for the variable-sparsity update, and uses a penalty-method-type approach.
An alternative scheme employs an ℓ0 penalty instead, and uses the augmented Lagrangian:
$$\text{(P5)}\quad \min_{x, W, \{\hat{x}_j\}, \{\alpha_j\}, \mu} \; \sum_j \|W \hat{x}_j - \alpha_j\|_2^2 + \lambda\, Q(W) + \nu \|F_u x - y\|_2^2 + \eta^2 \sum_j \|\alpha_j\|_0 + \sum_j \mathrm{Re}\left\{\mu_j^H (R_j x - \hat{x}_j)\right\} + \frac{\tau}{2} \sum_j \|R_j x - \hat{x}_j\|_2^2$$
µ is a Lagrange multiplier matrix with the µ_j ∈ C^n as its columns.
This is an unconstrained formulation, which is still non-convex. We solve it using the alternating direction method of multipliers (ADMM).
We can group some terms together in (P5) and set µ̃_j = µ_j/τ (scaled multipliers).

22 Algorithm for Unconstrained TLMRI
The update of the α_j uses simple hard thresholding of W x̂_j at threshold level η:
$$\min_{\{\alpha_j\}} \; \sum_j \|W \hat{x}_j - \alpha_j\|_2^2 + \eta^2 \sum_j \|\alpha_j\|_0 \qquad (6)$$
The update of W uses the closed-form solution:
$$\min_W \; \sum_j \|W \hat{x}_j - \alpha_j\|_2^2 - \lambda \log|\det W| + \lambda \|W\|_F^2 \qquad (7)$$
The update of {x̂_j} involves a least squares problem in each x̂_j:
$$\min_{\{\hat{x}_j\}} \; \sum_j \|W \hat{x}_j - \alpha_j\|_2^2 + \frac{\tau}{2} \sum_j \|R_j x - \hat{x}_j - \tilde{\mu}_j\|_2^2 \qquad (8)$$
The update of the multipliers: µ̃_j ← µ̃_j - (R_j x - x̂_j).
The update of x is done efficiently in k-space:
$$\min_x \; \frac{\tau}{2} \sum_j \|R_j x - \hat{x}_j - \tilde{\mu}_j\|_2^2 + \nu \|F_u x - y\|_2^2 \qquad (9)$$
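The ℓ0-penalized code update (6) and the multiplier step have simple one-line forms; a sketch in which matrix columns hold the per-patch vectors and the names are illustrative (the W update reuses the closed form from slide 12):

```python
import numpy as np

def alpha_update(W, Xhat, eta):
    """Solve (6) exactly: entrywise hard thresholding of W x_hat_j at level eta
    (an entry survives only if its magnitude exceeds eta)."""
    Z = W @ Xhat                         # columns are W x_hat_j
    return np.where(np.abs(Z) > eta, Z, 0)

def multiplier_update(mu_tilde, patches, Xhat):
    """Scaled multiplier step: mu_j <- mu_j - (R_j x - x_hat_j), columnwise."""
    return mu_tilde - (patches - Xhat)
```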

23 Unconstrained TLMRI - Cartesian 4x Undersampling
[Figures: PSNR vs. iteration number for unconstrained TLMRI and DLMRI; reconstruction error maps for both methods.]
Our algorithm for unconstrained TLMRI is better and 19x faster than DLMRI.
The penalty approach performs similarly to the unconstrained one with an appropriate choice of error thresholds, but is slower.

24 Unconstrained TLMRI - 2D Random 5x Undersampling
[Figures: TLMRI and DLMRI reconstructions and the corresponding error maps; TLMRI here gives a 12x speedup over DLMRI.]
Data from Miki Lustig, UC Berkeley.

25 Conclusions
We proposed transform learning for undersampled MRI (TLMRI).
Each step in our algorithms involves simple closed-form solutions.
TLMRI provides comparable or better reconstructions than DLMRI.
TLMRI is significantly faster than DLMRI. The unconstrained TLMRI algorithm is faster than the penalty-based approach.
Speedups over DLMRI increase with patch size (>40x for 8x8 patches), which is promising for application to 3D/4D reconstruction.
Iterates in our TLMRI algorithms are empirically observed to converge.
Future work: adaptive overcomplete transforms and doubly sparse transforms for MRI; extension to dynamic MRI, functional MRI, etc.
