Primal Dual Pursuit: A Homotopy Based Algorithm for the Dantzig Selector


1 Primal Dual Pursuit: A Homotopy Based Algorithm for the Dantzig Selector
Muhammad Salman Asif
Thesis Committee: Justin Romberg (Advisor), James McClellan, Russell Mersereau
School of Electrical and Computer Engineering, Georgia Institute of Technology.

2 Measurement model / Problem statement
Measurement model: y = Ax + e, where
x: n-dimensional unknown signal,
y: m-dimensional measurement vector,
A: m × n measurement matrix,
e: error vector.
The Dantzig selector: a robust estimator for recovery of sparse signals from linear measurements.

(DS): \min_{\tilde{x}} \|\tilde{x}\|_1 \quad \text{subject to} \quad \|A^T(A\tilde{x} - y)\|_\infty \le \epsilon, \quad \text{for some } \epsilon > 0.

This is convex and can be recast into an LP. Primal dual pursuit: a homotopy approach to solve the Dantzig selector.
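As a concrete illustration of the LP recast (a minimal sketch, not the thesis implementation; the helper name dantzig_selector_lp and the use of scipy.optimize.linprog are my own choices): introduce a bound variable u with |x̃| ≤ u, so the objective becomes 1ᵀu, and unroll the infinity-norm constraint into the linear inequalities −ε ≤ AᵀAx̃ − Aᵀy ≤ ε.

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector_lp(A, y, eps):
    """Solve min ||x||_1 s.t. ||A^T (A x - y)||_inf <= eps as an LP.

    Variables are stacked as [x; u] with |x_i| <= u_i, so the
    objective sum(u) equals ||x||_1 at the optimum.
    """
    m, n = A.shape
    AtA = A.T @ A
    Aty = A.T @ y
    I = np.eye(n)
    Z = np.zeros((n, n))
    # Objective: minimize 0^T x + 1^T u.
    c = np.concatenate([np.zeros(n), np.ones(n)])
    # Inequalities G [x; u] <= h:
    #   x - u <= 0 and -x - u <= 0              (|x| <= u)
    #   A^T A x <= eps + A^T y                  (upper side of inf-norm)
    #  -A^T A x <= eps - A^T y                  (lower side of inf-norm)
    G = np.block([[ I,  -I],
                  [-I,  -I],
                  [ AtA,  Z],
                  [-AtA,  Z]])
    h = np.concatenate([np.zeros(2 * n), eps + Aty, eps - Aty])
    res = linprog(c, A_ub=G, b_ub=h, bounds=[(None, None)] * (2 * n))
    return res.x[:n]
```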

3 Outline
Background: sparse representation, compressed sensing
Introduction: l1 minimization, random sensing, l1 estimators
Primal dual pursuit: Dantzig selector primal-dual formulation, primal-dual homotopy, main algorithm, primal and dual update, update directions, numerical implementation
S-step solution property: optimality conditions, S-step solution property analysis, random matrices, incoherent ensemble
Simulation results
Conclusion & future work

4 Sparse Representation
A signal/image x(t) in the time/spatial domain can be represented in some basis ψ as

x(t) = \sum_i \alpha_i \psi_i(t), \quad \text{or} \quad x = \Psi\alpha

ψ_i: basis functions, e.g., sinusoids, wavelets, curvelets, ...
α_i: expansion coefficients in the ψ domain.
Most signals of interest are well represented by a small number of transform coefficients in some appropriate basis: the magnitudes of the sorted transform coefficients decay rapidly, usually following a power law

|\alpha|_{(k)} \lesssim k^{-r}, \quad \text{for some } r > 0.
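To see this decay concretely, here is a small numpy/scipy sketch (my own illustration, not from the slides) that computes DCT coefficients of a smooth signal and forms its best K-term approximation:

```python
import numpy as np
from scipy.fft import dct, idct

# A smooth test signal is compressible in the DCT basis: its sorted
# coefficient magnitudes decay roughly like a power law.
n = 1024
t = np.linspace(0, 1, n)
x = np.sin(4 * np.pi * t) + 0.5 * np.sin(17 * np.pi * t) ** 3

alpha = dct(x, norm='ortho')              # expansion coefficients
decay = np.sort(np.abs(alpha))[::-1]      # |alpha|_(k), nonincreasing

# Best K-term approximation: keep only the K largest coefficients.
K = 32
idx = np.argsort(np.abs(alpha))[::-1][:K]
alpha_K = np.zeros_like(alpha)
alpha_K[idx] = alpha[idx]
x_K = idct(alpha_K, norm='ortho')
rel_err = np.linalg.norm(x - x_K) / np.linalg.norm(x)
print(f"relative error of best {K}-term approximation: {rel_err:.2e}")
```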

5 Benefits of sparsity
Sparsity plays an important role in many signal processing applications, such as
Signal estimation in the presence of noise (thresholding)
Inverse problems for signal reconstruction (tomography)
Compression (transform coding)
Signal acquisition (compressed sensing)
[Figure: best K-term approximation with 25k wavelet coefficients]

6 Data acquisition (Sampling) / Compressed Sensing
Shannon-Nyquist sampling theorem: exact reconstruction of a band-limited signal is possible if the sampling rate is at least twice the maximum frequency; reconstruction is a linear process.
Compression (transform coding): transform the signal/image into some suitable basis, e.g., wavelets or discrete cosine (DCT); select the few best coefficients without causing much perceptual loss; transform back to the canonical basis.
So we sample a signal at or above the Nyquist rate, transform into some sparsifying basis, adaptively encode a small portion of it, and throw away the rest. Seems very wasteful!
Is it possible to recover a signal by acquiring only as many nonadaptive samples as we will be keeping eventually? The answer is YES, and lies in compressed sensing!

7 Compressed Sensing
"Entia non sunt multiplicanda praeter necessitatem" -- entities must not be multiplied beyond necessity. - Occam's Razor
Consider projecting the points of your favorite sculpture first onto a plane and then onto a single line. This is the power of dimensionality reduction! [Achlioptas 2001]

8 Compressed Sensing Model
Take m linear measurements of an n-dimensional signal x:

y_1 = \langle x, \varphi_1 \rangle, \; y_2 = \langle x, \varphi_2 \rangle, \; \ldots, \; y_m = \langle x, \varphi_m \rangle, \quad \text{or} \quad y = \Phi x

This is generalized sampling/sensing, equivalent to sampling in a transform domain; call the φ_k sensing functions. The choice of φ_k gives flexibility in acquisition:
Dirac delta functions: conventional sampling.
Block indicator functions: pixel values collected from CCD arrays.
Sinusoids: Fourier measurements in MRI.
Compressed sensing model: y = Φx, where Φ is the m × n measurement/sensing matrix and m ≪ n.

9 Signal Reconstruction
Compressed sensing model: y = Φx, where m ≪ n. An underdetermined system: impossible to solve in general! There are infinitely many possible solutions on the hyperplane

H := \{\hat{x} : \Phi\hat{x} = y\} = \mathcal{N}(\Phi) + x

However, the situation is different if x is sufficiently sparse and Φ obeys some incoherence properties.
Combinatorial search: find the sparsest vector in the hyperplane H:

(P_0): \min \|\tilde{x}\|_0 \quad \text{subject to} \quad \Phi\tilde{x} = y

This is a combinatorial optimization problem, known to be NP-hard.
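For intuition on why P0 is intractable, a brute-force sketch (illustrative only; the helper name solve_p0 is mine) that searches supports of increasing size:

```python
import numpy as np
from itertools import combinations

def solve_p0(Phi, y, tol=1e-10):
    """Brute-force P0: search supports of increasing size and return the
    sparsest x with Phi @ x = y (least squares on each candidate support).
    The cost grows like C(n, k), which is why P0 is intractable at scale.
    """
    m, n = Phi.shape
    for k in range(1, n + 1):
        for T in combinations(range(n), k):
            cols = list(T)
            xT, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
            if np.linalg.norm(Phi[:, cols] @ xT - y) < tol:
                x = np.zeros(n)
                x[cols] = xT
                return x
    return None
```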

10 Convex Relaxation: minimum l1 norm

(P_1): \min \|\tilde{x}\|_1 \quad \text{subject to} \quad \Phi\tilde{x} = y

This can be recast as an LP for real data and an SOCP for complex data.
[Figure: geometry of l1 minimization]

11 Uniform Uncertainty Principle
Uniform uncertainty principle (UUP), or Restricted Isometry Property:

(1 - \delta_S)\|c\|_2^2 \le \|\Phi_T c\|_2^2 \le (1 + \delta_S)\|c\|_2^2 \quad \text{(RIP)}

δ_S: S-restricted isometry constant. Φ_T: columns of Φ indexed by the set T.
The UUP requires Φ to obey (RIP) for every subset of columns T and coefficient sequence {c_j}, j ∈ T, such that |T| ≤ S. This essentially says that every subset of columns with cardinality at most S behaves almost like an orthonormal system.
An equivalent form of the uniform uncertainty principle, where the columns of Φ are not normalized and δ_S = 1/2:

\frac{1}{2}\frac{m}{n}\|c\|_2^2 \le \|\Phi_T c\|_2^2 \le \frac{3}{2}\frac{m}{n}\|c\|_2^2

12 Uniform Uncertainty Principle (continued)
Matrices which obey the UUP:
Gaussian matrix: m ≳ S log n
Bernoulli matrix: m ≳ S log n
Partial Fourier matrix: m ≳ S log^5 n
Incoherent ensemble: m ≳ μ^2 S log^5 n
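One can probe δ_S empirically by sampling column subsets; computing the exact constant would require checking all C(n, S) subsets, so the sketch below (my own helper, yielding only a lower bound on δ_S) samples at random:

```python
import numpy as np

def empirical_delta_S(Phi, S, trials=2000, rng=None):
    """Estimate the S-restricted isometry constant by sampling random
    column subsets T with |T| = S.  For each T, the extreme singular
    values of Phi_T give the tightest coefficient vectors c, so this is
    a lower bound on delta_S (the exact value is combinatorial)."""
    rng = np.random.default_rng(0) if rng is None else rng
    m, n = Phi.shape
    delta = 0.0
    for _ in range(trials):
        T = rng.choice(n, size=S, replace=False)
        s = np.linalg.svd(Phi[:, T], compute_uv=False)
        delta = max(delta, abs(s[0] ** 2 - 1), abs(s[-1] ** 2 - 1))
    return delta

# e.g. a Gaussian ensemble whose columns have unit norm in expectation:
m, n, S = 128, 512, 8
Phi = np.random.default_rng(1).normal(size=(m, n)) / np.sqrt(m)
print(empirical_delta_S(Phi, S))
```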

13 Stable Recovery
Compressed sensing model in the presence of noise: y = Φx + e.
l1 minimization with data fidelity constraints:
Quadratic constraints (also known as the Lasso in the statistics community):

\min \|\tilde{x}\|_1 \quad \text{subject to} \quad \|\Phi\tilde{x} - y\|_2 \le \epsilon

Bounded residual correlation, the Dantzig selector:

\min \|\tilde{x}\|_1 \quad \text{subject to} \quad \|\Phi^T(\Phi\tilde{x} - y)\|_\infty \le \epsilon

where ε is usually chosen close to \sqrt{2\log n}\,\sigma, assuming the entries of e are i.i.d. N(0, σ²).

14 The Dantzig selector
Dantzig selector:

\min \|\tilde{x}\|_1 \quad \text{subject to} \quad \|\Phi^T(\Phi\tilde{x} - y)\|_\infty \le \epsilon

where ε is usually chosen close to \sqrt{2\log n}\,\sigma, assuming the entries of e are i.i.d. N(0, σ²).
If the columns of Φ are unit normed, then the solution x* to DS obeys

\|x - x^*\|_2^2 \le C \cdot 2\log n \cdot \Big(\sigma^2 + \sum_{i=1}^{n} \min(x_i^2, \sigma^2)\Big)

Soft thresholding (ideal denoising): estimate x from noisy samples y = x + e via

\min \|\tilde{x}\|_1 \quad \text{subject to} \quad \|y - \tilde{x}\|_\infty \le \lambda\sigma.
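Because the infinity-norm constraint decouples across coordinates, the denoising problem above is solved exactly by coordinate-wise soft thresholding. A minimal sketch (my own illustration, using the √(2 log n)·σ threshold from the slide):

```python
import numpy as np

def soft_threshold(y, thresh):
    """Coordinate-wise solution of min ||x||_1 s.t. ||y - x||_inf <= thresh:
    each x_i = sign(y_i) * max(|y_i| - thresh, 0)."""
    return np.sign(y) * np.maximum(np.abs(y) - thresh, 0.0)

# Ideal-denoising setting: y = x + e with e ~ N(0, sigma^2).
rng = np.random.default_rng(0)
n, sigma = 1000, 0.1
x = np.zeros(n); x[:20] = 1.0
y = x + sigma * rng.normal(size=n)
x_hat = soft_threshold(y, np.sqrt(2 * np.log(n)) * sigma)
print(np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```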

15 Primal Dual Pursuit

16 Dantzig Selector Primal and Dual
System model: y = Ax + e, where
x ∈ R^n: unknown signal, y ∈ R^m: measurement vector, A ∈ R^{m×n}: measurement matrix, e ∈ R^m: noise vector.
Dantzig selector (Primal):

\min \|\tilde{x}\|_1 \quad \text{subject to} \quad \|A^T(A\tilde{x} - y)\|_\infty \le \epsilon

Dantzig selector (Dual):

\max_{\lambda} \; -(\epsilon\|\lambda\|_1 + \langle \lambda, A^T y\rangle) \quad \text{subject to} \quad \|A^T A \lambda\|_\infty \le 1

where λ ∈ R^n is the dual vector.

17 Dantzig Selector Primal and Dual
Strong duality tells us that at any primal-dual solution pair (x*, λ*) corresponding to a certain ε,

\|x^*\|_1 = -(\epsilon\|\lambda^*\|_1 + \langle \lambda^*, A^T y\rangle)

or equivalently

\|x^*\|_1 + \epsilon\|\lambda^*\|_1 = -\langle x^*, A^T A \lambda^*\rangle + \langle \lambda^*, A^T(Ax^* - y)\rangle. \quad (1)

Using (1) and the complementary slackness and feasibility conditions for the primal and dual problems, we get the following four optimality conditions for the solution pair (x*, λ*), where Γ_x, Γ_λ are the supports of x*, λ* and z_x, z_λ the corresponding sign sequences:

K1. A_{\Gamma_\lambda}^T (Ax^* - y) = \epsilon\, z_\lambda
K2. A_{\Gamma_x}^T A\lambda^* = -z_x
K3. |a_\gamma^T (Ax^* - y)| < \epsilon \quad \text{for all } \gamma \in \Gamma_\lambda^c
K4. |a_\gamma^T A\lambda^*| < 1 \quad \text{for all } \gamma \in \Gamma_x^c
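A small numerical check of K1-K4 is useful when debugging a homotopy implementation. The sketch below (my own helper, following the sign conventions reconstructed above) reads the supports and signs off the candidate vectors themselves:

```python
import numpy as np

def check_optimality(A, y, x, lam, eps, tol=1e-8):
    """Numerically verify conditions K1-K4 for a candidate pair (x, lam)."""
    Gx = np.flatnonzero(np.abs(x) > tol)     # primal support Gamma_x
    Gl = np.flatnonzero(np.abs(lam) > tol)   # dual support Gamma_lambda
    zx, zl = np.sign(x[Gx]), np.sign(lam[Gl])
    r = A.T @ (A @ x - y)                    # primal constraint values
    d = A.T @ (A @ lam)                      # dual constraint values
    K1 = np.allclose(r[Gl], eps * zl, atol=tol)
    K2 = np.allclose(d[Gx], -zx, atol=tol)
    K3 = np.all(np.abs(np.delete(r, Gl)) < eps + tol)
    K4 = np.all(np.abs(np.delete(d, Gx)) < 1 + tol)
    return K1, K2, K3, K4
```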

18 Primal Dual Pursuit
Homotopy principle: start from an artificial initial value and iteratively move towards the desired solution by gradually adjusting the homotopy parameter(s). In our formulation the homotopy parameter is ε.
Follow the path traced by the sequence of primal-dual solution pairs (x_k, λ_k) for a range of ε_k, as ε_k → ε, and consequently x_k → x*.
[Figure: homotopy path from (x_0, ε_0) through (x_1, ε_1), (x_2, ε_2), ... to (x, ε), with primal iterates x_k and dual iterates λ_k.]

19 Primal Dual Pursuit
Active primal constraints give us the sign and support of the dual vector λ; active dual constraints give us the sign and support of the primal vector x:

A_{\Gamma_\lambda}^T (Ax_k - y) = \epsilon_k z_\lambda, \qquad A_{\Gamma_x}^T A\lambda_k = -z_x

We keep track of the supports Γ_x, Γ_λ and sign sequences z_x, z_λ at all times.
Start at a sufficiently large ε_k such that x_k = 0 and only one dual constraint is active (use ε_k = ‖A^T y‖_∞).
Move in the direction which reduces ε the most, until there is some change in the supports.
Then update the supports and sign sequences for the primal and dual vectors, and find a new direction to move in.
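The initialization is cheap; a sketch (helper name my own):

```python
import numpy as np

def pdp_initialize(A, y):
    """Homotopy start: x_0 = 0 and eps_0 = ||A^T y||_inf, so exactly one
    constraint is active; its index seeds the dual support."""
    c = A.T @ y
    eps0 = np.abs(c).max()
    gamma0 = int(np.argmax(np.abs(c)))   # first active constraint index
    x0 = np.zeros(A.shape[1])
    return x0, eps0, gamma0
```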

20 Primal Dual Pursuit
[Figure: piecewise-linear homotopy path from (x_0, ε_0) to (x, ε).]
The homotopy path for x_k is piecewise linear, and the kinks in this path represent critical values of ε_k where the primal and/or dual supports change: either a new element enters the support or an element from within the support shrinks to zero.
At any instant, the optimality conditions (K1-K4) must be obeyed by the primal-dual solution pair (x_k, λ_k).

21 Primal Dual Pursuit
At every kth step we have primal-dual vectors (x_k, λ_k), their supports (Γ_x, Γ_λ), and sign sequences (z_x, z_λ). We can divide each step into two main parts:
Primal update: compute the update direction ∂x and the smallest step size δ such that either a new element enters Γ_λ or an existing element leaves Γ_x.
Dual update: compute the update direction ∂λ and the smallest step size θ such that either a new element enters Γ_x or an existing element leaves Γ_λ.
Set

x_{k+1} = x_k + \delta\,\partial x, \qquad \lambda_{k+1} = \lambda_k + \theta\,\partial\lambda

and update the primal-dual supports and sign sequences.

22 Primal Update
The primal constraints must satisfy

a_\gamma^T (Ax_{k+1} - y) = \pm\epsilon_{k+1} \quad \text{for all } \gamma \in \Gamma_\lambda
|a_\gamma^T (Ax_{k+1} - y)| \le \epsilon_{k+1} \quad \text{for all } \gamma \in \Gamma_\lambda^c

Writing p_k(γ) := a_γ^T (Ax_k − y) and d_k(γ) := a_γ^T A ∂x, the constraints on Γ_λ^c become

|p_k(\gamma) + \delta\, d_k(\gamma)| \le \epsilon_k - \delta \quad \text{for all } \gamma \in \Gamma_\lambda^c

A new constraint becomes active as ε_k decreases at step size

\delta^+ = {\min_{i \in \Gamma_\lambda^c}}^{+} \left( \frac{\epsilon_k - p_k(i)}{1 + d_k(i)}, \; \frac{\epsilon_k + p_k(i)}{1 - d_k(i)} \right), \qquad i^+ = \text{the minimizing index}

An element from within Γ_x shrinks to zero at step size

\delta^- = {\min_{i \in \Gamma_x}}^{+} \left( \frac{-x_k(i)}{\partial x(i)} \right), \qquad i^- = \text{the minimizing index}

where min^+ denotes the minimum over positive arguments. Take δ = min(δ^+, δ^-).
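In code, the primal step-size rule is a minimum over positive ratios; the dual step on the next slide is computed analogously with a_k, b_k in place of p_k, d_k. A hedged numpy sketch (helper names my own):

```python
import numpy as np

def primal_step(A, y, x, dx, eps_k, Gx, Gl_c):
    """Smallest step delta along dx (sketch of the rule above).
    delta_plus : a constraint outside Gamma_lambda becomes active;
    delta_minus: an entry of x on Gamma_x shrinks to zero."""
    p = A.T @ (A @ x - y)          # p_k(gamma)
    d = A.T @ (A @ dx)             # d_k(gamma)

    def min_pos(vals):
        vals = vals[vals > 1e-12]  # keep only positive candidates
        return vals.min() if vals.size else np.inf

    with np.errstate(divide='ignore', invalid='ignore'):
        delta_plus = min_pos(np.concatenate([
            (eps_k - p[Gl_c]) / (1 + d[Gl_c]),
            (eps_k + p[Gl_c]) / (1 - d[Gl_c])]))
        delta_minus = min_pos(-x[Gx] / dx[Gx])
    return min(delta_plus, delta_minus)
```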

23 Dual Update
The dual constraints must satisfy

a_\nu^T A\lambda_{k+1} = \pm 1 \quad \text{for all } \nu \in \Gamma_x
|a_\nu^T A\lambda_{k+1}| \le 1 \quad \text{for all } \nu \in \Gamma_x^c

Writing a_k(ν) := a_ν^T Aλ_k and b_k(ν) := a_ν^T A ∂λ, the constraints on Γ_x^c become

|a_k(\nu) + \theta\, b_k(\nu)| \le 1 \quad \text{for all } \nu \in \Gamma_x^c

\theta^+ = {\min_{j \in \Gamma_x^c}}^{+} \left( \frac{1 - a_k(j)}{b_k(j)}, \; \frac{1 + a_k(j)}{-b_k(j)} \right), \qquad j^+ = \text{the minimizing index}

\theta^- = {\min_{j \in \Gamma_\lambda}}^{+} \left( \frac{-\lambda(j)}{\partial\lambda(j)} \right), \qquad j^- = \text{the minimizing index}

(min^+ as before). Take θ = min(θ^+, θ^-).

24 Update Directions
Primal update direction:

\partial x = \begin{cases} -(A_{\Gamma_\lambda}^T A_{\Gamma_x})^{-1} z_\lambda & \text{on } \Gamma_x \\ 0 & \text{elsewhere} \end{cases}

Dual update direction, where γ is the new element entering Γ_λ with sign z_γ:

\partial\lambda = \begin{cases} -z_\gamma (A_{\Gamma_x}^T A_{\Gamma_\lambda})^{-1} A_{\Gamma_x}^T a_\gamma & \text{on } \Gamma_\lambda \\ z_\gamma & \text{on } \gamma \\ 0 & \text{elsewhere} \end{cases}

Why? Maintaining K2 requires

A_{\Gamma_x}^T A(\lambda + \theta\,\partial\lambda) = -z_x \;\Longrightarrow\; A_{\Gamma_x}^T A_{\Gamma_\lambda}\,\partial\lambda + (A_{\Gamma_x}^T a_\gamma)\, z_\gamma = 0.
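A direct transcription of these two directions into numpy (a sketch under the reconstructed sign conventions; along the path |Γ_x| = |Γ_λ|, so the Gram submatrices are square):

```python
import numpy as np

def update_directions(A, Gx, Gl, z_lambda, gamma_new, z_gamma):
    """Primal and dual update directions.  Gx, Gl are index arrays;
    gamma_new is the constraint entering Gamma_lambda with sign z_gamma."""
    n = A.shape[1]
    AGx, AGl = A[:, Gx], A[:, Gl]
    dx = np.zeros(n)
    # A_{Gl}^T A_{Gx} dx = -z_lambda on the primal support
    dx[Gx] = -np.linalg.solve(AGl.T @ AGx, z_lambda)
    dlam = np.zeros(n)
    # A_{Gx}^T A_{Gl} dlam + (A_{Gx}^T a_gamma) z_gamma = 0
    dlam[Gl] = -z_gamma * np.linalg.solve(AGx.T @ AGl,
                                          AGx.T @ A[:, gamma_new])
    dlam[gamma_new] = z_gamma
    return dx, dlam
```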

25-26 Primal Dual Pursuit Algorithm
[Algorithm listing slides; the pseudocode was not captured in this transcription.]

27 Numerical Implementation
Main computational cost: the update directions (∂x, ∂λ) and the step sizes (δ, θ).
There is no need to solve a new system at every step: just update the most recent inverse matrix whenever the supports change (matrix inversion lemma / rank-one update). Appending a column a to A and a column b to B gives the bordered matrix

\tilde{A}^T \tilde{B} = \begin{bmatrix} A & a \end{bmatrix}^T \begin{bmatrix} B & b \end{bmatrix} = \begin{bmatrix} A^T B & A^T b \\ a^T B & a^T b \end{bmatrix} =: \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}

28 Numerical Implementation
Just update the most recent inverse matrix whenever the supports change (matrix inversion lemma / rank-one update):

\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}^{-1} = \begin{bmatrix} A_{11}^{-1} + A_{11}^{-1} A_{12} S^{-1} A_{21} A_{11}^{-1} & -A_{11}^{-1} A_{12} S^{-1} \\ -S^{-1} A_{21} A_{11}^{-1} & S^{-1} \end{bmatrix}

where S = A_{22} - A_{21} A_{11}^{-1} A_{12} is the Schur complement of A_{11}. Conversely, if

\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}^{-1} = \begin{bmatrix} Q_{11} & Q_{12} \\ Q_{21} & Q_{22} \end{bmatrix}, \quad \text{then} \quad A_{11}^{-1} = Q_{11} - Q_{12} Q_{22}^{-1} Q_{21}.

The computational cost for one step is just a few matrix-vector multiplications.
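A sketch of both updates (helper names my own; all inputs are 2-D arrays). Growing the support appends a block row/column and uses the Schur complement; shrinking recovers the smaller inverse from the partitioned blocks:

```python
import numpy as np

def grow_inverse(B_inv, A11, A12, A21, A22):
    """Given B_inv = A11^{-1}, return the inverse of the bordered matrix
    [[A11, A12], [A21, A22]] via the Schur complement
    S = A22 - A21 A11^{-1} A12: O(k^2) work instead of a fresh O(k^3)."""
    U = B_inv @ A12                  # A11^{-1} A12
    V = A21 @ B_inv                  # A21 A11^{-1}
    S_inv = np.linalg.inv(A22 - A21 @ U)
    return np.block([[B_inv + U @ S_inv @ V, -U @ S_inv],
                     [-S_inv @ V,             S_inv]])

def shrink_inverse(Q, k):
    """Inverse of A11 recovered from Q = [[A11,A12],[A21,A22]]^{-1} when
    the last k rows/columns are removed: A11^{-1} = Q11 - Q12 Q22^{-1} Q21."""
    Q11, Q12 = Q[:-k, :-k], Q[:-k, -k:]
    Q21, Q22 = Q[-k:, :-k], Q[-k:, -k:]
    return Q11 - Q12 @ np.linalg.inv(Q22) @ Q21
```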

29 S-step Solution
An S-sparse signal can be recovered in S primal-dual steps!
Random measurements y = Ax_0 with m ≳ S² log n:
Gaussian: entries of A independently selected to be i.i.d. N(0, 1/m).
Bernoulli: entries of A independently selected to be ±1/√m with equal probability.
Incoherent measurements:

S \le \frac{1}{2}\left(1 + \frac{1}{M}\right), \quad \text{where } M = \max_{i \ne j} |\langle a_i, a_j \rangle|

30 Optimality Conditions
K1. A_{\Gamma_\lambda}^T (Ax^* - y) = \epsilon\, z_\lambda
K2. A_{\Gamma_x}^T A\lambda^* = -z_x
K3. \|A_{\Gamma_\lambda^c}^T (Ax^* - y)\|_\infty < \epsilon
K4. \|A_{\Gamma_x^c}^T A\lambda^*\|_\infty < 1

Set Γ := Γ_x, z := z_x, and

\hat{\lambda} = \begin{cases} -(A_\Gamma^T A_\Gamma)^{-1} z & \text{on } \Gamma \\ 0 & \text{elsewhere} \end{cases}, \qquad \partial x = \hat{\lambda}

Under the conditions
H1. A_Γ is full rank,
H2. \|A_{\Gamma^c}^T A_\Gamma (A_\Gamma^T A_\Gamma)^{-1} z\|_\infty < 1,
H3. \mathrm{sign}[(A_\Gamma^T A_\Gamma)^{-1} z] = z,

set x_ε = x_0 + ε\hat{λ}, with Γ_λ = Γ_x and z_λ = -z_x. Then (x_ε, \hat{λ}) is a solution pair for every

0 \le \epsilon \le \epsilon_{crit} := \min_{\gamma \in \Gamma} \left( \frac{-x_0(\gamma)}{\hat{\lambda}(\gamma)} \right)

(each ratio is positive under H3).

31 S-step Solution
[Figure: backward homotopy path from (x_0, 0) through (x_1, ε_1), ..., (x_{S-1}, ε_{S-1}) to (x_S, ε_S), with dual iterates λ_0, ..., λ_{S-1} and x_d = λ_d at (x_d, ε_d).]
Trace the path backwards, starting from the exact solution x_0. The S-step solution property holds if x_S = 0, meaning all elements are removed from the support in S steps. This happens only if conditions (H1-H3) hold at every step.

32 Dantzig Shrinkability
1. k = 0, Γ_0 = supp(x_0), and z_0 = sign(x_0 on Γ_0).
2. If x_k = 0, return Success.
3. Check that

\|A_{\Gamma_k^c}^T A_{\Gamma_k} (A_{\Gamma_k}^T A_{\Gamma_k})^{-1} z_k\|_\infty < 1 \quad \text{and} \quad \mathrm{sign}[(A_{\Gamma_k}^T A_{\Gamma_k})^{-1} z_k] = z_k

If either condition fails, break and return Failure.
4. Set

\lambda_k = \begin{cases} -(A_{\Gamma_k}^T A_{\Gamma_k})^{-1} z_k & \text{on } \Gamma_k \\ 0 & \text{on } \Gamma_k^c \end{cases}

\epsilon_{k+1} = \min_{\gamma \in \Gamma_k} \left( \frac{-x_k(\gamma)}{\lambda_k(\gamma)} \right), \qquad \gamma_{k+1}^0 = \arg\min_{\gamma \in \Gamma_k} \left( \frac{-x_k(\gamma)}{\lambda_k(\gamma)} \right)

x_{k+1} = x_k + \epsilon_{k+1} \lambda_k, \qquad \Gamma_{k+1} = \Gamma_k \setminus \gamma_{k+1}^0, \qquad z_{k+1} = z_k \text{ restricted to } \Gamma_{k+1}

Set k ← k + 1 and return to step 2.
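The backward-tracing check is easy to prototype. The sketch below (my own helper, following the reconstruction above, not the thesis code) returns whether the support empties with (H1-H3) holding at every step:

```python
import numpy as np

def is_dantzig_shrinkable(A, x0, tol=1e-9):
    """Trace the homotopy path backwards from the exact solution x0,
    removing one support element per step while checking H1-H3."""
    x = x0.astype(float).copy()
    G = np.flatnonzero(np.abs(x) > tol)          # Gamma_k
    z = np.sign(x[G])
    while G.size:
        AG = A[:, G]
        M = AG.T @ AG
        if np.linalg.matrix_rank(M) < G.size:    # H1: A_Gamma full rank
            return False
        w = np.linalg.solve(M, z)                # (A_G^T A_G)^{-1} z
        Gc = np.setdiff1d(np.arange(A.shape[1]), G)
        if np.max(np.abs(A[:, Gc].T @ (AG @ w))) >= 1:   # H2
            return False
        if not np.array_equal(np.sign(w), z):            # H3
            return False
        lam = -w                                 # lambda_k on Gamma_k
        ratios = -x[G] / lam                     # positive under H3
        eps = ratios.min()                       # eps_{k+1}
        x[G] = x[G] + eps * lam                  # zeroes the minimizer
        keep = np.abs(x[G]) > tol
        G, z = G[keep], z[keep]                  # Gamma_{k+1}, z_{k+1}
    return True
```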

33 Sufficient Conditions for S-step Solution
Let A_Γ be full rank (H1) and G = I - A_\Gamma^T A_\Gamma. Then (H2-H3) will be satisfied if \|G\| < 1 and

\max_{\gamma \in \{1,\ldots,n\}} |\langle (A_\Gamma^T A_\Gamma)^{-1} Y_\gamma, z \rangle| < 1, \quad (2)

with

Y_\gamma = \begin{cases} A_\Gamma^T a_\gamma & \gamma \in \Gamma^c \\ A_\Gamma^T a_\gamma - 1_\gamma & \gamma \in \Gamma \end{cases}

where 1_γ is the γth standard basis vector.

34 Condition H3
If \|G\| < 1, we can write (A_\Gamma^T A_\Gamma)^{-1} z as a Neumann series:

(A_\Gamma^T A_\Gamma)^{-1} z = (I - G)^{-1} z = \sum_{\ell=0}^{\infty} G^\ell z = z + \sum_{\ell=1}^{\infty} G^\ell z

so condition (H3) will be satisfied if

\max_{\gamma \in \Gamma} \Big| \Big\langle 1_\gamma, \sum_{\ell=1}^{\infty} G^\ell z \Big\rangle \Big| < 1.

Moreover, with g_\gamma := G 1_\gamma,

\max_{\gamma \in \Gamma} \Big| \Big\langle 1_\gamma, \sum_{\ell=1}^{\infty} G^\ell z \Big\rangle \Big| = \max_{\gamma \in \Gamma} \Big| \Big\langle \sum_{\ell=1}^{\infty} G^{\ell-1} g_\gamma, z \Big\rangle \Big| = \max_{\gamma \in \Gamma} |\langle (A_\Gamma^T A_\Gamma)^{-1} g_\gamma, z \rangle|.

35 Outline of Proof for the S-step Solution Property
Bound the norm of Y_γ for all γ ∈ {1, ..., n}; bound the norm of w_γ := (A_\Gamma^T A_\Gamma)^{-1} Y_\gamma; use the Cauchy-Schwarz inequality to satisfy |\langle w_\gamma, z \rangle| < 1 for all γ.
Random matrices (Gaussian or Bernoulli): each entry of Y_γ is bounded by C_\beta \sqrt{\log n / m} with probability exceeding 1 - O(n^{-\beta}), for some constant β > 0. So \|Y_\gamma\|_2 \le C_\beta \sqrt{S \log n / m} with the same probability.
The uniform uncertainty principle tells us that \|(A_\Gamma^T A_\Gamma)^{-1}\| < 2 with overwhelmingly high probability.
Using the Cauchy-Schwarz inequality, (2) is satisfied with probability exceeding 1 - O(n^{-\beta}) if m \ge C_\beta S^2 \log n.

36 Experimental Results (Gaussian)

37 Experimental Results (Ortho-Gaussian)

38 Lasso and Dantzig Selector
Lasso:

\min_{\tilde{x}} \; \frac{1}{2}\|y - A\tilde{x}\|_2^2 + \epsilon\|\tilde{x}\|_1 \quad \text{(Lasso)}

Optimality conditions:

L1. A_\Gamma^T (Ax^* - y) = -\epsilon z
L2. |a_\gamma^T (Ax^* - y)| < \epsilon \quad \text{for all } \gamma \in \Gamma^c

Update directions:

\partial x_\Gamma = (A_\Gamma^T A_\Gamma)^{-1} z \quad \text{(Lasso update)}
\partial x_{\Gamma_x} = -(A_{\Gamma_\lambda}^T A_{\Gamma_x})^{-1} z_\lambda \quad \text{(DS update)}

The Lasso homotopy inverts the symmetric Gram matrix A_\Gamma^T A_\Gamma, while the Dantzig selector homotopy inverts the non-symmetric A_{\Gamma_\lambda}^T A_{\Gamma_x}.

39 Future Work
A better bound on the required number of measurements: can m ≳ S² log n be improved to S log^α n, for some small α > 0?
Investigate the effect of orthogonal rows on the S-step recovery.
Dynamic update of measurements.
Implementation for large-scale problems.

40 Questions

41 Thank you! Muhammad Salman Asif
