The Analysis Cosparse Model for Signals and Images

1 The Analysis Cosparse Model for Signals and Images. Raja Giryes, Computer Science Department, Technion. Joint work with Sangnam Nam (CMI, Marseille, France), Michael Elad (Technion, Haifa, Israel), Rémi Gribonval (INRIA, Rennes, France) and Mike E. Davies (University of Edinburgh, Scotland). The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Programme (ERC grant agreement).

2 Outline. The classical synthesis model versus the new analysis model. A recipe for generating algorithms for the new model. A new set of tools that provides a theory with uniform recovery guarantees. Some open problems.

3 Agenda. Analysis and Synthesis - Introduction. From Synthesis to Analysis. Experimental Results. Theoretical Guarantees.

4 Problem Setup. Given the linear measurements y = M x_0 + e, with x_0 ∈ R^d and y ∈ R^m, recover x_0 from y. e is the noise term and M ∈ R^{m×d} is the measurement matrix (m < d). In the noiseless case e = 0.
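
For concreteness, this setup can be written in a few lines of numpy; the dimensions and noise level below are arbitrary choices for illustration, not values used in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 200, 80                                 # signal dimension and number of measurements, m < d
M = rng.standard_normal((m, d)) / np.sqrt(m)   # random Gaussian measurement matrix

x0 = rng.standard_normal(d)                    # unknown signal (no prior imposed yet)
e = 0.01 * rng.standard_normal(m)              # noise term
y = M @ x0 + e                                 # observed linear measurements
```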

5 Synthesis Sparsity Prior. Given that x_0 is k-sparse, we can recover it stably from y under proper assumptions on M. The same holds if x_0 has a k-sparse representation under certain types of dictionaries D ∈ R^{d×n}: x_0 = D z, ||z||_0 ≤ k.

6 Synthesis Sparsity Prior. [Figure: illustration of x_0 = D z with ||z||_0 = k = 5; the zero and non-zero locations of z are marked.]

7 Synthesis Minimization Problem. The problem we aim at solving in synthesis is ẑ = argmin_z ||z||_0 s.t. y = M D z, and then x̂ = D ẑ. It can be approximated stably in polynomial time under proper assumptions on M and D.

8 Analysis Model. Another way to model the sparsity of the signal is the analysis model [Elad, Milanfar and Rubinstein, 2007]. It looks at the coefficients of Ω x_0, where Ω ∈ R^{p×d} is a given transform - the analysis dictionary.

9 Analysis Model - Example. Assume Ω is the 2D finite difference operator and x_0 is a 4×4 image, so p = 24. Ω x_0 has 3 non-zeros and l = 21 zeros. What is the dimension of the subspace in which x_0 resides? [Figure: x_0 and Ω x_0 with the zero and non-zero locations marked.]

10 Cosparsity and Corank. We are interested in the zeros in Ω x_0. Cosparsity l: Ω x_0 contains l zeros. Cosupport Λ: the locations of the zeros. Corank r: the rank of Ω_Λ. If Ω is in general position then l = r. In general, x_0 resides in a subspace of dimension d − r.
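
A minimal numpy sketch of these definitions, assuming Omega is given as a dense array and using a tolerance tol to decide which analysis coefficients count as zeros:

```python
import numpy as np

def cosupport(Omega, x, tol=1e-10):
    """Cosupport Lambda: indices of the (near-)zero entries of Omega @ x."""
    return np.flatnonzero(np.abs(Omega @ x) < tol)

def cosparsity_and_corank(Omega, x, tol=1e-10):
    Lam = cosupport(Omega, x, tol)
    l = Lam.size                              # cosparsity: number of zeros
    r = np.linalg.matrix_rank(Omega[Lam, :])  # corank: rank of the rows indexed by Lambda
    return l, r                               # x resides in a subspace of dimension d - r
```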

11 Synthesis versus Analysis. Synthesis - sparsity k: the number of non-zeros. Analysis - cosparsity l: the number of zeros.

12 Synthesis versus Analysis. Synthesis - sparsity k non-zeros; synthesis dictionary D ∈ R^{d×n}. Analysis - cosparsity l zeros; analysis dictionary Ω ∈ R^{p×d}.

13 Synthesis versus Analysis. Synthesis - sparsity k non-zeros; synthesis dictionary D ∈ R^{d×n}; sparse representation x = D z with z having (at most) k non-zeros. Analysis - cosparsity l zeros; analysis dictionary Ω ∈ R^{p×d}; cosparse representation: Ω x has (at least) l zeros, i.e. ||Ω x||_0 ≤ p − l.

14 Synthesis versus Analysis. Synthesis - sparsity k non-zeros; dictionary D ∈ R^{d×n}; sparse representation x = D z with at most k non-zeros; D_i is the i-th column of D. Analysis - cosparsity l zeros; dictionary Ω ∈ R^{p×d}; cosparse representation with at least l zeros in Ω x; ω_i is the i-th row of Ω.

15 Synthesis versus Analysis. Synthesis - support T ⊆ {1..n}: the indices of the non-zeros in z. Analysis - cosupport Λ ⊆ {1..p}: the indices of the zeros in Ω x.

16 Synthesis versus Analysis. Synthesis - support T ⊆ {1..n}, the indices of the non-zeros in z; x resides in a k-dimensional subspace spanned by D_T = {D_i, i ∈ T}. Analysis - cosupport Λ ⊆ {1..p}, the indices of the zeros in Ω x; x resides in a (d − r)-dimensional subspace orthogonal to the subspace spanned by Ω_Λ = {ω_i, i ∈ Λ}.

17 Analysis Minimization Problem. We work directly with x_0 and not with its representation. This generates a different family of signals: the signals are characterized by their behavior and not by their building blocks. The problem we aim at solving in analysis is x̂ = argmin_x ||Ω x||_0 s.t. y = M x.

18 Phantom and Fourier Sampling. [Figure: x_0, the Shepp-Logan phantom, and M x_0, 12 sampled Fourier radial lines.]

19 Naïve Recovery. [Figure: naïve recovery - the minimum l_2-norm solution, i.e. with no prior.]

20 Shepp-Logan Derivatives. [Figure: Ω_2D-DIF x_0 - the result of applying the 2D finite difference operator to the phantom.]
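
One possible construction of this operator, sketched with scipy.sparse (stacking horizontal and vertical first-order differences; boundary handling conventions vary between papers):

```python
import numpy as np
import scipy.sparse as sp

def diff_2d(n):
    """2D finite difference analysis operator for an n-by-n image.

    Stacks horizontal and vertical first-order differences, giving
    p = 2 * n * (n - 1) rows acting on d = n * n pixels.
    """
    D1 = sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
    I = sp.eye(n)
    Dh = sp.kron(I, D1)   # differences along each row
    Dv = sp.kron(D1, I)   # differences along each column
    return sp.vstack([Dh, Dv]).tocsr()

Omega = diff_2d(4)        # the 4x4 example of slide 9: p = 24, d = 16
```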

21 Approximation Methods. The analysis problem is hard, just like the synthesis one, and thus approximation techniques are required.

22 l_1 Relaxation. For synthesis: ẑ = argmin_z ||z||_1 s.t. y = M D z. For analysis: x̂ = argmin_x ||Ω x||_1 s.t. y = M x.
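
As an illustration (not the solver used in the experiments here), the analysis l_1 relaxation can be expressed directly with cvxpy, assuming M, Omega and y are numpy arrays:

```python
import cvxpy as cp

def analysis_l1(M, Omega, y):
    """Solve min ||Omega x||_1 subject to y = M x (noiseless l_1 relaxation)."""
    x = cp.Variable(M.shape[1])
    prob = cp.Problem(cp.Minimize(cp.norm1(Omega @ x)), [M @ x == y])
    prob.solve()
    return x.value
```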

23 More Approximation Techniques. In synthesis we have many more approximation techniques: OMP, ROMP [Needell and Vershynin 2009], CoSaMP [Needell and Tropp 2009], SP [Dai and Milenkovic 2009], IHT [Blumensath and Davies, 2009], HTP [Foucart, 2010]. Can we convert them to the analysis framework?

24 Agenda. Analysis and Synthesis - Introduction. From Synthesis to Analysis. Experimental Results. Theoretical Guarantees. (Novel part.)

25 Today's Algorithms. We present a general scheme for translating synthesis operations into analysis ones. This provides us with a recipe for converting the algorithms. The recipe is general and easy to use. We apply this recipe to CoSaMP, SP, IHT and HTP. The theoretical study of analysis techniques is more complicated.

26 (Co)Sparse Vector Addition. Synthesis: given two vectors z_1 and z_2 with supports T_1 and T_2 of sizes k_1 and k_2, find the support of z_1 + z_2. Solution: supp(z_1 + z_2) ⊆ T_1 ∪ T_2; the maximal size of the support is |T_1 ∪ T_2| ≤ k_1 + k_2. Analysis: given two vectors x_1 and x_2 with cosupports Λ_1 and Λ_2 of sizes l_1 and l_2, find the cosupport of x_1 + x_2. Solution: cosupp(x_1 + x_2) ⊇ Λ_1 ∩ Λ_2; the minimal size of the cosupport is |Λ_1 ∩ Λ_2| ≥ l_1 + l_2 − p.
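
A toy numerical check of both bounds, with small hypothetical vectors (here a1 and a2 stand for the analysis coefficients Omega @ x1 and Omega @ x2):

```python
import numpy as np

# Synthesis side: supports combine by union, |T1 ∪ T2| <= k1 + k2.
T1, T2 = {0, 3}, {3, 5}
assert len(T1 | T2) <= len(T1) + len(T2)

# Analysis side: cosupports combine by intersection, |L1 ∩ L2| >= l1 + l2 - p.
p = 10
a1 = np.array([0, 0, 1, 0, 2, 0, 0, 0, 3, 0], dtype=float)  # Omega @ x1
a2 = np.array([0, 1, 0, 0, 0, 0, 2, 0, 0, 0], dtype=float)  # Omega @ x2
L1 = set(np.flatnonzero(a1 == 0))
L2 = set(np.flatnonzero(a2 == 0))
L12 = set(np.flatnonzero(a1 + a2 == 0))
assert L12 >= (L1 & L2)                       # cosupp(x1 + x2) contains L1 ∩ L2
assert len(L1 & L2) >= len(L1) + len(L2) - p
```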

27 Orthogonal Projection. Synthesis: given a representation z and a support T, project orthogonally onto the subspace supported on T. Solution: keep the elements of z supported on T and zero the rest, z_T. Analysis: given a vector x and a cosupport Λ, project orthogonally onto the subspace orthogonal to the one spanned by Ω_Λ. Solution: compute the projection onto the subspace spanned by Ω_Λ and remove it from the vector: Q_Λ = I − Ω_Λ† Ω_Λ.
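
Both projections in numpy, as a hedged sketch; the analysis projection uses the pseudo-inverse formula Q_Λ = I − Ω_Λ† Ω_Λ given above:

```python
import numpy as np

def synthesis_proj(z, T):
    """Keep the entries of z on the support T and zero the rest."""
    zT = np.zeros_like(z)
    idx = np.fromiter(T, dtype=int)
    zT[idx] = z[idx]
    return zT

def analysis_proj(Omega, Lam, x):
    """Apply Q_Lam to x: remove its component in the row space of Omega_Lam."""
    O = Omega[np.fromiter(Lam, dtype=int), :]
    return x - np.linalg.pinv(O) @ (O @ x)
```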

28 Objective-Aware Projection. Synthesis: given a vector z and a support T, perform an objective-aware projection onto the subspace spanned by D_T. Solution: solve argmin_z ||y − M D z||_2 s.t. z_{T^C} = 0; this has a simple closed-form solution. Analysis: given a vector x and a cosupport Λ, perform an objective-aware projection onto the subspace orthogonal to the one spanned by Ω_Λ. Solution: solve argmin_x ||y − M x||_2 s.t. Ω_Λ x = 0; this also has a closed-form solution.
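
The analysis-side problem is a linearly constrained least-squares; one simple way to solve it is to parameterize the constraint set by a null-space basis. A sketch, assuming dense matrices:

```python
import numpy as np
from scipy.linalg import null_space, lstsq

def objective_aware_proj(M, Omega, Lam, y):
    """argmin_x ||y - M x||_2 subject to Omega_Lam x = 0.

    Writes x = N b, where the columns of N span {x : Omega_Lam x = 0},
    and solves the resulting unconstrained least-squares in b."""
    N = null_space(Omega[np.fromiter(Lam, dtype=int), :])
    b, *_ = lstsq(M @ N, y)
    return N @ b
```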

29 (Co)Support Selection. Synthesis: given a non-sparse vector z, find the support T of the closest k-sparse vector. Solution: select the indices of the k largest elements in z, T = supp(z, k); this is optimal. Analysis: given a non-cosparse vector x, find the cosupport Λ of the closest l-cosparse vector. Solution: select the indices of the l smallest elements in Ω x, Λ = cosupp(Ω x, l); this solution is suboptimal.
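
Both selection rules in numpy; as noted above, the analysis rule is plain thresholding of the analysis coefficients and is suboptimal for a general Omega:

```python
import numpy as np

def supp(z, k):
    """Support of the closest k-sparse vector: indices of the k largest |z_i|."""
    return np.argsort(np.abs(z))[-k:]

def cosupp(Omega, x, l):
    """Thresholding cosupport selection: indices of the l smallest |(Omega x)_i|."""
    return np.argsort(np.abs(Omega @ x))[:l]
```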

30 Cosupport Selection. Task: find the cosupport of the closest l-cosparse vector to x: Λ̂ = argmin_Λ ||x − Q_Λ x||_2. In some cases efficient algorithms exist: simple thresholding for the unitary case; dynamic programming for the one-dimensional finite difference operator Ω_1D-DIF [Han et al., 2004] and for the fused-Lasso operator Ω_FUS = [Ω_1D-DIF; I] [Giryes et al., 2013].

31 Near-Optimal Cosupport Selection. In general, cosupport selection for an arbitrary Ω is an NP-hard problem [Gribonval et al., 2013]. A cosupport selection scheme Ŝ_l is near optimal with a constant C_l if for any vector v ∈ R^d: ||v − Q_{Ŝ_l(v)} v||_2^2 ≤ C_l · inf_{x l-cosparse} ||v − x||_2^2. Reminder: Q_{Ŝ_l(v)} = I − Ω_{Ŝ_l(v)}† Ω_{Ŝ_l(v)}.

32 Cosupport Selection - Open Problems. Find new types of analysis operators with optimal or near-optimal cosupport selection schemes. Find an efficient, global, near-optimal cosupport selection scheme for a given C_l.

33 Subspace Pursuit (SP) [Dai and Milenkovic 2009]. Initialization: r^0 = y, t = 0. Iterate: (1) find new support elements T_Δ = supp(D* M* r^t, k); (2) support update T̃ = T^t ∪ T_Δ; (3) calculate a temporal representation z_p = argmin_z ||y − M D z||_2 s.t. z_{T̃^C} = 0; (4) estimate the k-sparse support T^{t+1} = supp(z_p, k); (5) recalculate the representation ẑ^{t+1} = argmin_z ||y − M D z||_2 s.t. z_{(T^{t+1})^C} = 0; (6) update the residual r^{t+1} = y − M x̂^{t+1}, where x̂^{t+1} = D ẑ^{t+1}. Output: x̂ = x̂^t.

34 Analysis SP (ASP) [Giryes and Elad 2012]. Initialization: r^0 = y, t = 0, Λ^0 = {1..p}. Iterate: (1) find new cosupport elements Λ_Δ = Ŝ_al(M* r^t); (2) cosupport update Λ̃ = Λ^t ∩ Λ_Δ; (3) calculate a temporal solution x_p = argmin_x ||y − M x||_2 s.t. Ω_Λ̃ x = 0; (4) estimate an l-cosparse cosupport Λ^{t+1} = Ŝ_l(x_p); (5) compute a new solution x̂^{t+1} = argmin_x ||y − M x||_2 s.t. Ω_{Λ^{t+1}} x = 0; (6) update the residual r^{t+1} = y − M x̂^{t+1}. Output: x̂ = x̂^t.

35 Compressive Sampling Matching Pursuit (CoSaMP) [Needell and Tropp 2009]. Initialization: r^0 = y, t = 0. Iterate: (1) find new support elements T_Δ = supp(D* M* r^t, 2k); (2) support update T̃ = T^t ∪ T_Δ; (3) calculate a temporal representation z_p = argmin_z ||y − M D z||_2 s.t. z_{T̃^C} = 0; (4) estimate the k-sparse support T^{t+1} = supp(z_p, k); (5) compute the new solution ẑ^{t+1} = (z_p)_{T^{t+1}}; (6) update the residual r^{t+1} = y − M x̂^{t+1}, where x̂^{t+1} = D ẑ^{t+1}. Output: x̂ = x̂^t.

36 Analysis CoSaMP (ACoSaMP) [Giryes and Elad 2012]. Initialization: r^0 = y, t = 0, Λ^0 = {1..p}. Iterate: (1) find new cosupport elements Λ_Δ = Ŝ_al(M* r^t); (2) cosupport update Λ̃ = Λ^t ∩ Λ_Δ; (3) calculate a temporal solution x_p = argmin_x ||y − M x||_2 s.t. Ω_Λ̃ x = 0; (4) estimate an l-cosparse cosupport Λ^{t+1} = Ŝ_l(x_p); (5) compute the new solution x̂^{t+1} = Q_{Λ^{t+1}} x_p; (6) update the residual r^{t+1} = y − M x̂^{t+1}. Output: x̂ = x̂^t.

37 AIHT and AHTP. Applying the same scheme to iterative hard thresholding (IHT) and hard thresholding pursuit (HTP) yields their analysis versions, AIHT and AHTP. [Giryes, Nam, Gribonval and Davies 2011]

38 Algorithm Variations. Relaxed ACoSaMP (RACoSaMP) and relaxed ASP (RASP): instead of the constrained projection argmin_x ||y − M x||_2 s.t. Ω_Λ x = 0, minimize a relaxed objective in which the constraint Ω_Λ x = 0 is replaced by a penalty on ||Ω_Λ x||_2. Easier to solve in high-dimensional problems. [Giryes and Elad 2012]

39 Other Existing Analysis Methods. Greedy analysis pursuit (GAP) [Nam, Davies, Elad and Gribonval, 2013]: selects the cosupport iteratively, removing elements from the cosupport in a greedy way at each iteration. Simple thresholding. Orthogonal backward greedy (OBG): an algorithm for the case M = I [Rubinstein, Peleg, Elad, 2013].

40 Agenda. Analysis and Synthesis - Introduction. From Synthesis to Analysis. Experimental Results. Theoretical Guarantees. (Novel part.)

41 Noiseless Signal Recovery - Setup. A synthetic noiseless compressed sensing experiment. Ω ∈ R^{p×d} is the transpose of a random tight frame, with d = 200 and p = 240 (reminder: M ∈ R^{m×d}). M is drawn from a random Gaussian ensemble. We draw a phase diagram over a grid of δ = m/d versus ρ = (d − l)/m. Each experiment is repeated 50 times. We select a = 1.

42 [Figure: phase diagrams for ACoSaMP, ASP, GAP and l_1-minimization.]

43 High-Dimensional Image Recovery. Ω is the 2D finite differences operator. We recover the Shepp-Logan phantom from a few radial lines of its two-dimensional Fourier transform. 15/12 radial lines suffice for RACoSaMP/RASP - less than 5.77%/4.63% of the data. The performance is similar to GAP and better than l_1.

44 High-Dimensional Image Recovery. [Figure: the Shepp-Logan phantom and its 12 sampled radial lines.]

45 High-Dimensional Image - Stable Recovery. Additive zero-mean white Gaussian noise with an SNR of 20. Naïve recovery: PSNR = 18 dB. RASP and RACoSaMP (the latter using 25 radial lines) reach a PSNR of 36 dB - the same performance as GAPN.

46 High-Dimensional Stable Recovery. [Figure: naïve recovery versus RASP recovery.]

47 Agenda. Analysis and Synthesis - Introduction. From Synthesis to Analysis. Experimental Results. Theoretical Guarantees. (Novel part.)

48 Restricted Isometry Property (RIP). A matrix M satisfies the RIP [Candès and Tao, 2006] with a constant δ_k if for every k-sparse vector u it holds that (1 − δ_k) ||u||_2^2 ≤ ||M u||_2^2 ≤ (1 + δ_k) ||u||_2^2.
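
The RIP constant cannot be computed efficiently, but a Monte-Carlo probe over random unit-norm k-sparse vectors gives an empirical lower bound on δ_k; a hedged sketch (the number of trials is an arbitrary choice):

```python
import numpy as np

def rip_lower_bound(M, k, trials=2000, seed=0):
    """Empirical lower bound on the RIP constant delta_k of M."""
    rng = np.random.default_rng(seed)
    d = M.shape[1]
    delta = 0.0
    for _ in range(trials):
        u = np.zeros(d)
        S = rng.choice(d, size=k, replace=False)  # random support of size k
        u[S] = rng.standard_normal(k)
        u /= np.linalg.norm(u)                    # unit-norm k-sparse vector
        delta = max(delta, abs(np.linalg.norm(M @ u) ** 2 - 1.0))
    return delta                                  # the true delta_k is at least this
```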

49 Algorithm Guarantees. Given that z_0 is a k-sparse vector and that M D satisfies the restricted isometry property (RIP) with a small enough constant δ_bk (b is a small, algorithm-dependent integer), then after a constant number of iterations i*, SP, CoSaMP, IHT and HTP satisfy ||ẑ^{i*} − z_0||_2 ≤ c_1 ||e||_2, where c_1 is a given constant (different for each algorithm). [Needell and Tropp 2009, Dai and Milenkovic 2009, Blumensath and Davies 2009, Foucart 2010]

50 Ω-RIP. A matrix M satisfies the Ω-RIP with a constant δ_l^Ω if for every l-cosparse vector v it holds that (1 − δ_l^Ω) ||v||_2^2 ≤ ||M v||_2^2 ≤ (1 + δ_l^Ω) ||v||_2^2.

51 Near-Optimal Cosupport Selection. Reminder: a cosupport selection scheme Ŝ_l is near optimal with a constant C_l if for any vector v ∈ R^d: ||v − Q_{Ŝ_l(v)} v||_2^2 ≤ C_l · inf_{x l-cosparse} ||v − x||_2^2, where Q_{Ŝ_l(v)} = I − Ω_{Ŝ_l(v)}† Ω_{Ŝ_l(v)}.

52 Reconstruction Guarantees. Theorem: apply either ACoSaMP or ASP with a = (2l − p)/l, obtaining x̂^t after t iterations. For C = max(C_l, C_{2l−p}) there exists a reference constant δ(C, M) greater than zero such that if δ_{4l−3p}^Ω ≤ δ(C, M), then after a finite number of iterations t*, ||x̂^{t*} − x_0||_2 ≤ c_2 ||e||_2, where c_2 is a constant that depends on δ_{4l−3p}^Ω, C_l, C_{2l−p} and M. [Giryes, Nam, Elad, Gribonval and Davies, 2013]

53 Special Cases - Optimal Projection. Given an optimal projection scheme, the condition of the theorem is simply a bound on δ_{4l−3p}^Ω. When Ω is unitary, the RIP and the Ω-RIP coincide, and the condition becomes a bound on the RIP constant δ_{4k} of M Ω*.

54 AIHT and AHTP. AIHT and AHTP admit a theorem similar to that of ASP and ACoSaMP. Given an optimal projection scheme, AIHT and AHTP recover stably if δ_{2l−p}^Ω < 1/3. In the unitary case the condition is δ_{2k}(M Ω*) < 1/3.

55 Ω-RIP Matrices. Theorem: for a fixed Ω ∈ R^{p×d}, if M ∈ R^{m×d} satisfies, for any z and ε > 0, P(| ||M z||_2^2 − ||z||_2^2 | ≥ ε ||z||_2^2) ≤ e^{−C_M m ε^2}, then for any constant δ_l^Ω > 0, M satisfies the Ω-RIP with constant δ_l^Ω, with probability exceeding 1 − e^{−t}, provided m is on the order of ((p − l) log(9p/(p − l)) + t) / (C_M (δ_l^Ω)^2). [Blumensath, Davies, 2009], [Giryes, Nam, Elad, Gribonval and Davies, 2013]

56 Linear Dependencies in Ω. For the Ω-RIP, the number of measurements we need is m = O((p − l) log p). If Ω ∈ R^{p×d} is in general position then l < d, so if p = 2d we have m > d. If we allow linear dependencies then l can be greater than d, so if p = 2d we can have m < d. Conclusion: linear dependencies in Ω are permitted, and even encouraged.

57 Ω-RIP - Open Questions. Johnson-Lindenstrauss matrices are also RIP and Ω-RIP matrices. Is there an advantage to the Ω-RIP over the RIP? Can we find a family of matrices that has the Ω-RIP but not the RIP? Can we get a guarantee in terms of d − r (r is the corank) instead of p − l? What is coherence in the analysis model?

58 Related Work. D-RIP-based recovery guarantees for l_1-minimization [Candès, Eldar, Needell, Randall, 2011], [Liu, Mi, Li, 2012]. ERC-like recovery conditions for GAP [Nam, Davies, Elad and Gribonval, 2013] and for l_1-minimization [Vaiter, Peyré, Dossal, Fadili, 2013]. D-RIP-based TV recovery guarantees [Needell, Ward, 2013]. Thresholding-denoising (M = I) performance for the case of Gaussian noise [Peleg, Elad, 2013].

59 Implications for the Synthesis Model. Classically, in the synthesis model, linear dependencies between small groups of columns are not allowed. In the analysis model, however, since our target is the signal, linear dependencies are even encouraged. Does the same hold for synthesis?

60 Implications for the Synthesis Model. Signal-space CoSaMP: signal recovery under the synthesis model even when the dictionary is highly coherent [Davenport, Needell, Wakin, 2013]. Our analysis theoretical guarantees extend easily to synthesis [Giryes, Elad, 2013]. The uniqueness and stability conditions for signal recovery and for representation recovery are not the same [Giryes, Elad, 2013].

61 Conclusion. A new sparsity model - the analysis model - in which the zeros contain the information. A general recipe for converting standard synthesis techniques into analysis ones. New theory and tools. Linear dependencies are a good thing in the analysis dictionary. Implications for the synthesis model.

64 Reconstruction Guarantees (full statement). Theorem: apply either ACoSaMP or ASP with a = (2l − p)/l, obtaining x̂^t after t iterations. If C = max(C_l, C_{2l−p}) and M satisfy a certain condition (*), and δ_{4l−3p}^Ω ≤ δ(C, M), where δ(C, M) is a constant greater than zero whenever (*) is satisfied, then after a finite number of iterations t*, ||x̂^{t*} − x_0||_2 ≤ c_2 ||e||_2, where c_2 is a constant that depends on δ_{4l−3p}^Ω, C_l, C_{2l−p} and M. [Giryes, Nam, Elad, Gribonval and Davies, 2013]

65 Iterative Hard Thresholding (IHT). IHT [Blumensath and Davies, 2009] has two steps. Gradient step with step size μ: z̃^{i+1} = ẑ^i + μ D* M* (y − M D ẑ^i). Projection onto the closest k-sparse vector: ẑ^{i+1} = T_k(z̃^{i+1}). T_k is a hard thresholding operator that keeps the k largest elements and zeros the rest.
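
The two steps translate directly into numpy; a minimal sketch, with A standing for the composite matrix M D and mu a user-chosen step size:

```python
import numpy as np

def hard_threshold(z, k):
    """Projection onto the closest k-sparse vector: keep the k largest entries."""
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-k:]
    out[idx] = z[idx]
    return out

def iht(y, A, k, mu, n_iter=200):
    """Synthesis IHT: gradient step followed by hard thresholding."""
    z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = hard_threshold(z + mu * A.T @ (y - A @ z), k)
    return z
```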

66 Hard Thresholding Pursuit (HTP). HTP [Foucart, 2010] is similar to IHT but differs in the projection step. Instead of projecting onto the closest k-sparse vector, it takes the support T^{i+1} of T_k(z̃^{i+1}) and minimizes the fidelity term: ẑ^{i+1} = argmin_z ||y − M D z||_2 s.t. z_{(T^{i+1})^C} = 0, whose solution on T^{i+1} is (D*_{T^{i+1}} M* M D_{T^{i+1}})^{−1} D*_{T^{i+1}} M* y.

67 Changing Step Size. Instead of using a constant step size μ, one can choose in each iteration the step size that minimizes the fidelity term ||y − M D ẑ^{i+1}||_2 [Cevher, 2011].

68 A-IHT and A-HTP [Giryes, Nam, Gribonval and Davies 2011]. Gradient step: x̃^{i+1} = x̂^i + μ M* (y − M x̂^i). Cosupport selection: Λ̂^{i+1} = cosupp(Ω x̃^{i+1}, l). Projection, for A-IHT: x̂^{i+1} = Q_{Λ̂^{i+1}} x̃^{i+1} (reminder: Q_Λ = I − Ω_Λ† Ω_Λ). For A-HTP: x̂^{i+1} = argmin_x ||y − M x||_2 s.t. Ω_{Λ̂^{i+1}} x = 0, i.e. x̂^{i+1} = (Q_{Λ̂^{i+1}} M* M Q_{Λ̂^{i+1}})† Q_{Λ̂^{i+1}} M* y.
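
An A-IHT sketch assembled from the building blocks above (thresholding cosupport selection and the orthogonal projection Q_Λ); a minimal, hedged version with a fixed step size mu:

```python
import numpy as np

def aiht(y, M, Omega, l, mu, n_iter=200):
    """Analysis IHT: gradient step, cosupport selection, orthogonal projection."""
    x = np.zeros(M.shape[1])
    for _ in range(n_iter):
        x = x + mu * M.T @ (y - M @ x)           # gradient step
        Lam = np.argsort(np.abs(Omega @ x))[:l]  # l smallest analysis coefficients
        O = Omega[Lam, :]
        x = x - np.linalg.pinv(O) @ (O @ x)      # project: enforce Omega_Lam x = 0
    return x
```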

69 Algorithm Variations. An optimal changing step size instead of a fixed one: μ^{i+1} = argmin_μ ||y − M x̂^{i+1}(μ)||_2. An approximated changing step size. Underestimating the cosparsity: run with some l̃ ≤ l.

70 Reconstruction Guarantees. Theorem: apply either AIHT or AHTP with a certain constant step size or the optimal step size, obtaining x̂^t after t iterations. If C_l and M satisfy a certain condition (*), and δ_{2l−p}^Ω ≤ δ_1(C_l, M), where δ_1(C_l, M) is a constant greater than zero whenever (*) is satisfied, then after a finite number of iterations t*, ||x̂^{t*} − x_0||_2 ≤ c_1 ||e||_2, where c_1 is a constant that depends on δ_{2l−p}^Ω, C_l and M. [Giryes, Nam, Elad, Gribonval and Davies, 2013]

71 Special Case - Unitary Transform. When Ω is unitary, the condition of the first theorems is simply δ_{2k}(M Ω*) < 1/3.
