Sparse analysis Lecture III: Dictionary geometry and greedy algorithms
Anna C. Gilbert, Department of Mathematics, University of Michigan
Intuition from an ONB
Key step in the algorithm:
⟨r, ϕ_j⟩ = ⟨x − Σ_i c_i ϕ_i, ϕ_j⟩ = ⟨x, ϕ_j⟩ − Σ_i c_i ⟨ϕ_i, ϕ_j⟩
Coherence
Definition. The coherence of a dictionary is µ = max_{j≠l} |⟨ϕ_j, ϕ_l⟩|.
[Figure: two dictionaries of three atoms φ1, φ2, φ3 — small coherence (good) vs. large coherence (bad).]
Coherence: lower bound
Theorem. For a d × N dictionary,
µ ≥ √( (N − d) / (d(N − 1)) ).
[Welch 73]
Theorem. For most pairs of orthonormal bases in R^d, the coherence between the two is
µ = O( √(log d / d) ).
[Donoho, Huo 99]
Large, incoherent dictionaries
- Fourier + Dirac: N = 2d, µ = 1/√d
- wavelet packets: N = d log d, µ = 1/√2
There are large dictionaries with coherence close to the lower (Welch) bound; e.g., Kerdock codes: N = d², µ = 1/√d.
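As a check on these figures, the coherence of the Fourier + Dirac dictionary is easy to compute numerically. A sketch using NumPy; the construction below is mine, not from the lecture:

```python
import numpy as np

d = 16
# d Dirac atoms (standard basis) alongside d unit-norm Fourier atoms.
dirac = np.eye(d)
fourier = np.fft.fft(np.eye(d)) / np.sqrt(d)
Phi = np.hstack([dirac, fourier])            # d x 2d dictionary

# Coherence: largest |<phi_j, phi_l>| over distinct atoms.
G = np.abs(Phi.conj().T @ Phi)
np.fill_diagonal(G, 0)
mu = G.max()
print(mu, 1 / np.sqrt(d))                    # both are 0.25 for d = 16
```

Every Dirac–Fourier cross inner product has magnitude exactly 1/√d, while atoms within each basis are orthogonal, which matches the µ = 1/√d figure quoted above.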
Approximation algorithms (error)
Sparse. Given k ≥ 1, solve
argmin_c ‖x − Φc‖₂ s.t. ‖c‖₀ ≤ k,
i.e., find the best approximation of x using k atoms.
c_opt = optimal solution, E_opt = ‖Φc_opt − x‖₂ = optimal error.
An algorithm returns ĉ with (1) ‖ĉ‖₀ = k and (2) E = ‖Φĉ − x‖₂ ≤ C₁ E_opt.
(Error) approximation ratio: E / E_opt = C₁ E_opt / E_opt = C₁.
Approximation algorithms (terms)
An algorithm returns ĉ with (1) ‖ĉ‖₀ = C₂ k and (2) E = ‖Φĉ − x‖₂ = E_opt.
(Terms) approximation ratio: ‖ĉ‖₀ / ‖c_opt‖₀ = C₂ k / k = C₂.
Bi-criteria approximation algorithms
An algorithm returns ĉ with (1) ‖ĉ‖₀ = C₂ k and (2) E = ‖Φĉ − x‖₂ ≤ C₁ E_opt.
(Terms, Error) approximation ratio: (C₂, C₁).
Greedy algorithms
Build the approximation one step at a time... choosing the best atom at each step.
Orthogonal Matching Pursuit (OMP) [Mallat 92], [Davis 97]
Input. Dictionary Φ, signal x, number of steps k.
Output. Coefficient vector c with k nonzeros, Φc ≈ x.
Initialize. Counter t = 1, residual r₀ = x, c = 0.
1. Greedy selection. Find the atom ϕ_{j_t} with j_t = argmax_l |⟨r_{t−1}, ϕ_l⟩|.
2. Update. Find c_{l_1}, ..., c_{l_t} to solve min ‖x − Σ_s c_{l_s} ϕ_{l_s}‖₂; new residual r_t = x − Φc.
3. Iterate. t ← t + 1; stop when t > k.
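The three steps above translate almost line-for-line into code. A minimal NumPy sketch (the helper name `omp` and the toy dictionary are mine; step 2 is the least-squares re-fit over all atoms selected so far):

```python
import numpy as np

def omp(Phi, x, k):
    """Orthogonal Matching Pursuit: k greedy steps; all selected
    coefficients are re-fit by least squares after each selection."""
    N = Phi.shape[1]
    residual = x.copy()
    support = []
    c = np.zeros(N)
    for _ in range(k):
        # 1. Greedy selection: atom most correlated with the residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # 2. Update: best coefficients over the atoms chosen so far.
        coef, *_ = np.linalg.lstsq(Phi[:, support], x, rcond=None)
        c = np.zeros(N)
        c[support] = coef
        residual = x - Phi @ c
    return c

# Toy redundant dictionary: the standard basis of R^8 plus one extra atom.
d = 8
extra = np.ones((d, 1)) / np.sqrt(d)
Phi = np.hstack([np.eye(d), extra])          # 8 x 9 dictionary
x = 2 * Phi[:, 0] - Phi[:, 4]                # 2-sparse signal
c_hat = omp(Phi, x, k=2)
print(np.linalg.norm(Phi @ c_hat - x))       # residual ~ 0: support recovered
```

Because the residual is re-orthogonalized against all selected atoms at each step, OMP never selects the same atom twice; this is the property the recovery proof later in the lecture relies on.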
Many greedy algorithms share this outline.
Matching Pursuit: replace step 2 by c_{j_t} ← c_{j_t} + ⟨r_{t−1}, ϕ_{j_t}⟩.
Thresholding: choose the m atoms for which |⟨x, ϕ_l⟩| is among the m largest values.
Alternate stopping rules: ‖r_t‖₂ ≤ ε, or max_l |⟨r_t, ϕ_l⟩| ≤ ε.
Many other variations.
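Thresholding, by contrast, needs no iteration: one pass over the inner products, then a single fit. A sketch under my own conventions; fitting the selected coefficients by least squares at the end is a choice, not part of the lecture's definition:

```python
import numpy as np

def thresholding(Phi, x, m):
    # Keep the m atoms whose inner products with x are largest in magnitude,
    # then fit their coefficients by least squares in one shot.
    scores = np.abs(Phi.T @ x)
    support = np.argsort(scores)[-m:]
    coef, *_ = np.linalg.lstsq(Phi[:, support], x, rcond=None)
    c = np.zeros(Phi.shape[1])
    c[support] = coef
    return c

# Toy redundant dictionary: standard basis of R^8 plus a normalized all-ones atom.
d = 8
Phi = np.hstack([np.eye(d), np.ones((d, 1)) / np.sqrt(d)])
x = 2 * Phi[:, 0] - Phi[:, 4]
c_hat = thresholding(Phi, x, m=2)
print(np.linalg.norm(Phi @ c_hat - x))   # ~0 here: the two largest scores pick the right atoms
```

Thresholding is cheaper than OMP (one matrix-vector product instead of k), but its selection never adapts to a residual, so it is easier to fool on coherent dictionaries.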
Convergence of OMP
Theorem. Suppose Φ is a complete dictionary for R^d. For any vector x, the residual after t steps of OMP satisfies
‖r_t‖₂ ≤ c₁ t^{−1/2}.
[DeVore, Temlyakov]
Even if x can be expressed sparsely, OMP may take d steps before the residual is zero. But sometimes OMP correctly identifies sparse representations.
Sparse representation with OMP
Suppose x has a k-sparse representation x = Σ_{l∈Λ} c_l ϕ_l with |Λ| = k, i.e., c_opt is non-zero exactly on Λ.
It suffices to find Λ. When can OMP do so?
Define
Φ_Λ = [ϕ_{l_1} ϕ_{l_2} ... ϕ_{l_k}], l_s ∈ Λ, and Ψ_Λ = [ϕ_{l_1} ϕ_{l_2} ... ϕ_{l_{N−k}}], l_s ∉ Λ.
Define the greedy selection ratio
ρ(r) = max_{l∉Λ} |⟨r, ϕ_l⟩| / max_{l∈Λ} |⟨r, ϕ_l⟩| = ‖Ψ_Λᵀ r‖_∞ / ‖Φ_Λᵀ r‖_∞ = (max i.p. bad atoms) / (max i.p. good atoms).
OMP chooses a good atom iff ρ(r) < 1.
Exact Recovery Condition
Theorem (ERC). A sufficient condition for OMP to identify Λ after k steps is
max_{l∉Λ} ‖Φ_Λ⁺ ϕ_l‖₁ < 1,
where A⁺ = (AᵀA)⁻¹Aᵀ.
[Tropp 04]
A⁺x is the coefficient vector that synthesizes the best approximation of x using the atoms in A; P = AA⁺ is the orthogonal projector that produces this best approximation.
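For a concrete support, the ERC quantity is directly computable. A sketch (mine): `np.linalg.pinv` equals (AᵀA)⁻¹Aᵀ when the selected atoms are linearly independent, and the dictionary below is an illustrative toy:

```python
import numpy as np

def erc_value(Phi, Lambda):
    """max_{l not in Lambda} ||Phi_Lambda^+ phi_l||_1.
    The ERC holds for the support Lambda when this value is < 1."""
    Lambda = list(Lambda)
    pinv = np.linalg.pinv(Phi[:, Lambda])    # (A^T A)^{-1} A^T for full column rank
    off = [l for l in range(Phi.shape[1]) if l not in Lambda]
    return max(np.linalg.norm(pinv @ Phi[:, l], 1) for l in off)

# Toy dictionary: standard basis of R^8 plus a normalized all-ones atom.
d = 8
Phi = np.hstack([np.eye(d), np.ones((d, 1)) / np.sqrt(d)])
print(erc_value(Phi, [0, 4]))    # 2/sqrt(8) ~ 0.707 < 1: OMP finds this support
```

Here the only off-support atom with nonzero correlation to the span of Λ is the all-ones atom, contributing 1/√8 from each of the two selected basis atoms, hence the value 2/√8.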
Proof.
r₀ = x ∈ range(Φ_Λ).
At iteration t + 1, assume l₁, l₂, ..., l_t ∈ Λ; thus a_t ∈ range(Φ_Λ) and r_t = x − a_t ∈ range(Φ_Λ).
Express the orthogonal projector onto range(Φ_Λ) as (Φ_Λ⁺)ᵀ Φ_Λᵀ; therefore (Φ_Λ⁺)ᵀ Φ_Λᵀ r_t = r_t.
Bound
ρ(r_t) = ‖Ψ_Λᵀ r_t‖_∞ / ‖Φ_Λᵀ r_t‖_∞ = ‖Ψ_Λᵀ (Φ_Λ⁺)ᵀ Φ_Λᵀ r_t‖_∞ / ‖Φ_Λᵀ r_t‖_∞ ≤ ‖Ψ_Λᵀ (Φ_Λ⁺)ᵀ‖_{∞→∞} = ‖Φ_Λ⁺ Ψ_Λ‖_{1→1} = max_{l∉Λ} ‖Φ_Λ⁺ ϕ_l‖₁ < 1.
Then OMP selects an atom from Λ at iteration t + 1, and since it chooses a new atom at each iteration, after k iterations it has chosen all the atoms of Λ. ∎
Coherence bounds
Theorem. The ERC holds whenever k < (1/2)(µ⁻¹ + 1). Therefore, OMP can recover any sufficiently sparse signal.
[Tropp 04]
For most redundant dictionaries, µ ≈ 1/√d, so the bound allows k < (1/2)(√d + 1).
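Since µ is cheap to compute, the theorem gives an easily checkable sufficient sparsity level. A sketch on an illustrative toy dictionary of my own:

```python
import numpy as np

# Toy dictionary: standard basis of R^8 plus a normalized all-ones atom.
d = 8
Phi = np.hstack([np.eye(d), np.ones((d, 1)) / np.sqrt(d)])

G = np.abs(Phi.T @ Phi)
np.fill_diagonal(G, 0)
mu = G.max()                                  # 1/sqrt(8) ~ 0.354

# Largest integer k with k < (1/mu + 1)/2 (valid when the bound is not an integer).
k_max = int(np.ceil((1 / mu + 1) / 2)) - 1
print(mu, k_max)                              # coherence guarantees recovery only for k = 1
```

Note how conservative the coherence bound is: it certifies only k = 1 here, while the ERC, evaluated on a specific support, can certify larger supports for the same dictionary.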
Sparse
Theorem. Assume k ≤ (1/3) µ⁻¹. For any vector x, the approximation Φĉ after k steps of OMP satisfies ‖ĉ‖₀ ≤ k and
‖x − Φĉ‖₂ ≤ √(1 + 6k) ‖x − Φc_opt‖₂,
where c_opt is the best k-term approximation to x over Φ.
[Tropp 04]
Theorem. Assume k ≤ (1/4) µ⁻¹. Two-phase greedy pursuit produces x̂ = Φĉ s.t.
‖x − x̂‖₂ ≤ 3 ‖x − Φc_opt‖₂.
Assume k ≤ µ⁻¹. Two-phase greedy pursuit produces x̂ = Φĉ s.t.
‖x − x̂‖₂ ≤ √( 1 + 2µk² / (1 − 2µk)² ) ‖x − Φc_opt‖₂.
[Gilbert, Strauss, Muthukrishnan, Tropp 03]
Summary
- The geometry of the dictionary matters: we obtain sufficient conditions on that geometry under which Sparse problems can be solved efficiently.
- The algorithms are approximation algorithms (with respect to error).
- The algorithmic architecture is greedy pursuit.
- Next lecture: other algorithmic formulations and solutions.
More informationCompressive Sensing and Beyond
Compressive Sensing and Beyond Sohail Bahmani Gerorgia Tech. Signal Processing Compressed Sensing Signal Models Classics: bandlimited The Sampling Theorem Any signal with bandwidth B can be recovered
More informationIntroduction to Sparsity. Xudong Cao, Jake Dreamtree & Jerry 04/05/2012
Introduction to Sparsity Xudong Cao, Jake Dreamtree & Jerry 04/05/2012 Outline Understanding Sparsity Total variation Compressed sensing(definition) Exact recovery with sparse prior(l 0 ) l 1 relaxation
More informationA Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades
A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades Ngoc Hung Nguyen 1, Hai-Tan Tran 2, Kutluyıl Doğançay 1 and Rocco Melino 2 1 University of South Australia 2
More informationBias-free Sparse Regression with Guaranteed Consistency
Bias-free Sparse Regression with Guaranteed Consistency Wotao Yin (UCLA Math) joint with: Stanley Osher, Ming Yan (UCLA) Feng Ruan, Jiechao Xiong, Yuan Yao (Peking U) UC Riverside, STATS Department March
More informationMachine Learning for Signal Processing Sparse and Overcomplete Representations. Bhiksha Raj (slides from Sourish Chaudhuri) Oct 22, 2013
Machine Learning for Signal Processing Sparse and Overcomplete Representations Bhiksha Raj (slides from Sourish Chaudhuri) Oct 22, 2013 1 Key Topics in this Lecture Basics Component-based representations
More informationSparse Approximation and Variable Selection
Sparse Approximation and Variable Selection Lorenzo Rosasco 9.520 Class 07 February 26, 2007 About this class Goal To introduce the problem of variable selection, discuss its connection to sparse approximation
More informationCoSaMP. Iterative signal recovery from incomplete and inaccurate samples. Joel A. Tropp
CoSaMP Iterative signal recovery from incomplete and inaccurate samples Joel A. Tropp Applied & Computational Mathematics California Institute of Technology jtropp@acm.caltech.edu Joint with D. Needell
More informationHighly sparse representations from dictionaries are unique and independent of the sparseness measure. R. Gribonval and M. Nielsen
AALBORG UNIVERSITY Highly sparse representations from dictionaries are unique and independent of the sparseness measure by R. Gribonval and M. Nielsen October 2003 R-2003-16 Department of Mathematical
More informationSignal Recovery, Uncertainty Relations, and Minkowski Dimension
Signal Recovery, Uncertainty Relations, and Minkowski Dimension Helmut Bőlcskei ETH Zurich December 2013 Joint work with C. Aubel, P. Kuppinger, G. Pope, E. Riegler, D. Stotz, and C. Studer Aim of this
More informationEstimating Galaxy Spectral Functions Using Sparse Composite Models
Estimating Galaxy Spectral Functions Using Sparse Composite Models Han Liu (Advisors: Larry Wasserman, Christopher Genovese and John Lafferty) Carnegie Mellon University January 10, 2007 ABSTRACT Galaxy
More informationRecovering overcomplete sparse representations from structured sensing
Recovering overcomplete sparse representations from structured sensing Deanna Needell Claremont McKenna College Feb. 2015 Support: Alfred P. Sloan Foundation and NSF CAREER #1348721. Joint work with Felix
More informationSignal Sparsity Models: Theory and Applications
Signal Sparsity Models: Theory and Applications Raja Giryes Computer Science Department, Technion Michael Elad, Technion, Haifa Israel Sangnam Nam, CMI, Marseille France Remi Gribonval, INRIA, Rennes France
More informationALGORITHMS FOR SIMULTANEOUS SPARSE APPROXIMATION PART II: CONVEX RELAXATION
ALGORITHMS FOR SIMULTANEOUS SPARSE APPROXIMATION PART II: CONVEX RELAXATION JOEL A TROPP Abstract A simultaneous sparse approximation problem requests a good approximation of several input signals at once
More information