Sparse Legendre expansions via ℓ1 minimization

1 Sparse Legendre expansions via ℓ1 minimization. Rachel Ward, Courant Institute, NYU. Joint work with Holger Rauhut, Hausdorff Center for Mathematics, Bonn, Germany. June 8, 2010

2 Outline Sparse recovery for bounded orthonormal systems Sparse recovery for (certain) unbounded orthonormal systems From compressive sensing to function approximation Applications

4 Problem set-up. Consider the general problem: reconstruct an $s$-sparse vector $x \in \mathbb{C}^N$ or $x \in \mathbb{R}^N$ (or a compressible vector) from its vector of $m$ measurements $y = Ax$. Suppose that the measurements correspond to an underdetermined system: $s < m < N$, so that the system $y = Ax$ has infinitely many solutions. We want a sparse solution, and preferably a fast algorithm that performs the sparse reconstruction.

5 Algorithms for sparse recovery. $\ell_0$-minimization: with $\|x\|_0 := |\mathrm{supp}(x)|$, solve $\min_{x \in \mathbb{C}^N} \|x\|_0$ subject to $Ax = y$. Problem: $\ell_0$-minimization is NP-hard. $\ell_1$-minimization: $\min_x \|x\|_1 = \sum_{j=1}^N |x_j|$ subject to $Ax = y$. This is the convex relaxation of the $\ell_0$-minimization problem, and it can be solved efficiently.
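
As a concrete illustration of the convex relaxation: after splitting $x = u - v$ with $u, v \ge 0$, the equality-constrained $\ell_1$ problem becomes a linear program. A minimal Python sketch, assuming real-valued data; the helper name basis_pursuit and the toy dimensions are illustrative, not from the talk.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||x||_1 subject to Ax = y, for real A and y.

    Writing x = u - v with u, v >= 0 gives ||x||_1 = sum(u + v),
    so the problem becomes a standard-form linear program."""
    m, N = A.shape
    c = np.ones(2 * N)                # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])         # constraint: A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    u, v = res.x[:N], res.x[N:]
    return u - v

# Toy check: recover a 3-sparse vector from m = 12 Gaussian measurements.
rng = np.random.default_rng(1)
N, m = 40, 12
x = np.zeros(N)
x[[2, 17, 33]] = [1.0, -2.0, 0.5]
A = rng.standard_normal((m, N)) / np.sqrt(m)
print(np.linalg.norm(x - basis_pursuit(A, A @ x)))  # typically ~1e-10
```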

6 A sufficient condition for sparse recovery. The restricted isometry constant $\delta_s$ of a matrix $A \in \mathbb{C}^{m \times N}$ is defined as the smallest $\delta_s$ such that $(1 - \delta_s)\|x\|_2^2 \le \|Ax\|_2^2 \le (1 + \delta_s)\|x\|_2^2$ for all $s$-sparse $x \in \mathbb{C}^N$. (Candès, Romberg, Tao 2005; Foucart/Lai 2009): If $A \in \mathbb{R}^{m \times N}$ satisfies the RIP of order $2s$ with $\delta_{2s} \le 0.46\ldots$, then any vector $x \in \mathbb{R}^N$ may be approximated from $y = Ax + \xi$ with $\|\xi\|_2 \le \epsilon$ by $x^\# = \arg\min_z \|z\|_1$ subject to $\|Az - y\|_2 \le \epsilon$, up to error $\|x - x^\#\|_2 \le C_1 \frac{\|x - x_s\|_1}{\sqrt{s}} + C_2\,\epsilon$, where $x_s$ denotes the best $s$-term approximation to $x$.
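
Computing $\delta_s$ exactly requires checking all $\binom{N}{s}$ supports, so in practice one can only lower-bound it. A small Monte Carlo sketch (an illustration, not part of the talk): sample random supports and record the worst eigenvalue deviation of the restricted Gram matrix.

```python
import numpy as np

def rip_lower_bound(A, s, trials=2000, seed=0):
    """Monte Carlo lower bound on delta_s: for every support S of size s,
    the eigenvalues of A_S^T A_S must lie in [1 - delta_s, 1 + delta_s]."""
    rng = np.random.default_rng(seed)
    N = A.shape[1]
    delta = 0.0
    for _ in range(trials):
        S = rng.choice(N, size=s, replace=False)
        eigs = np.linalg.eigvalsh(A[:, S].T @ A[:, S])
        delta = max(delta, abs(eigs[0] - 1), abs(eigs[-1] - 1))
    return delta

rng = np.random.default_rng(0)
m, N, s = 100, 400, 5
A = rng.standard_normal((m, N)) / np.sqrt(m)   # columns have unit norm on average
print(rip_lower_bound(A, s))                   # lower bound on delta_5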

7 Which matrices satisfy the restricted isometry property? If $A \in \mathbb{R}^{m \times N}$ is a Gaussian or Bernoulli random matrix, $m = O(s \ln(N/s))$ measurements ensure that $A$ satisfies the restricted isometry property of order $s$ with high probability. In contrast, all available deterministic constructions run into a quadratic bottleneck: they require $m = O(s^2 \log N)$ measurements. A compromise between completely random and deterministic: structured random matrices.

8 Structured random matrices: the random partial Fourier matrix. Consider the discrete Fourier matrix $F \in \mathbb{C}^{N \times N}$ with entries $F_{l,k} = \frac{1}{\sqrt{N}} e^{2\pi i lk/N}$, $l, k = 1, \ldots, N$. Randomly subsample $m < N$ rows from $F$ to get the $m \times N$ random partial Fourier matrix. The measurements $y_l = (Fx)_l$ are a random subsample of the DFT of $x$. Theorem (Candès, Romberg, Tao 2005; Rudelson, Vershynin 2008). If $m = O(s \ln^4 N)$, then with high probability, the $m \times N$ random partial Fourier matrix satisfies the restricted isometry property of order $s$.
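
A direct construction, as a sketch (the convention $F_{l,k} = e^{2\pi i lk/N}/\sqrt{N}$ and subsampling without replacement are assumptions; the theory also covers sampling with replacement):

```python
import numpy as np

def random_partial_fourier(N, m, seed=0):
    """Subsample m rows of the unitary N x N DFT matrix uniformly at random."""
    rng = np.random.default_rng(seed)
    rows = rng.choice(N, size=m, replace=False)
    l, k = np.meshgrid(rows, np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * l * k / N) / np.sqrt(N), rows

F_sub, rows = random_partial_fourier(N=64, m=20)
print(F_sub.shape)   # (20, 64)
```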

9 The random partial Fourier matrix: function approximation interpretation. Random partial Fourier measurements correspond to samples of a trigonometric polynomial: $y_l = (Fx)_l = g_x(t_l)$, $t_l = 2\pi j_l/N$ with $j_l$ the selected row indices, where $g_x(t) := \sum_{k=0}^{N-1} x_k e^{ikt}$. We say that $g_x(t)$ is an $s$-sparse trigonometric polynomial of degree $N-1$ if the coefficient vector $(x_k)_{k=0}^{N-1}$ is $s$-sparse. The RIP of the random partial Fourier matrix means that all $s$-sparse trigonometric polynomials of degree $N-1$ can be recovered from their values on a fixed subset of $m = O(s \log^4 N)$ sampling points of the form $t_l = 2\pi l/N$.
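
Putting the last two slides together, an end-to-end sketch (assuming real coefficients so the $\ell_1$ problem stays a linear program; the complex constraints are handled by stacking real and imaginary parts):

```python
import numpy as np
from scipy.optimize import linprog

def random_partial_fourier(N, m, seed=0):
    rng = np.random.default_rng(seed)
    rows = rng.choice(N, size=m, replace=False)
    l, k = np.meshgrid(rows, np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * l * k / N) / np.sqrt(N)

N, m, s = 64, 24, 4
rng = np.random.default_rng(3)
x = np.zeros(N)
x[rng.choice(N, s, replace=False)] = rng.standard_normal(s)

F_sub = random_partial_fourier(N, m)
y = F_sub @ x                          # m random samples of g_x

# Basis pursuit with complex measurements and real coefficients:
# split x = u - v and stack real/imaginary parts of F_sub (u - v) = y.
A_eq = np.vstack([np.hstack([F_sub.real, -F_sub.real]),
                  np.hstack([F_sub.imag, -F_sub.imag])])
b_eq = np.concatenate([y.real, y.imag])
res = linprog(np.ones(2 * N), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
print(np.linalg.norm(x - x_hat))       # ~0 for typical draws
```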

10 Outline Sparse recovery for bounded orthonormal systems Sparse recovery for (certain) unbounded orthonormal systems From compressive sensing to function approximation Applications

11 Sparse recovery for more general orthonormal systems? We may consider the recovery of functions $f(u) = \sum_{k=0}^{N-1} x_k \psi_k(u)$, $u \in D \subset \mathbb{R}^d$, that are sparse relative to a function system $\psi_0, \ldots, \psi_{N-1} : D \to \mathbb{C}$ that is orthonormal with respect to a measure $\nu$ on $D$, from a number $m < N$ of point samples $y_l = f(u_l) = \sum_{k=0}^{N-1} x_k \psi_k(u_l)$, with $u_1, \ldots, u_m$ drawn i.i.d. from $\nu$, where $m$ is proportional to the sparsity level $s$. Whether sparse recovery is possible depends on the structure of the sampling matrix $A_{l,k} = \psi_k(u_l)$ associated to the orthonormal system.

12 Sparse recovery for general bounded orthonormal systems. Sampling matrix: $A_{l,k} = \psi_k(u_l)$, $l \in [m]$, $k \in [N]$. Theorem (Rauhut 09). Suppose that $\|\psi_k\|_\infty \le K$ for all $k \in [N]$. Let $A \in \mathbb{C}^{m \times N}$ be the sampling matrix associated to the bounded orthonormal system $\{\psi_k\}_{k=0}^{N-1}$ with sampling points $u_1, \ldots, u_m \in D$ chosen i.i.d. from the orthogonalization measure $\nu$. If $m = O(K^2 s \ln^4 N)$, then with probability exceeding $1 - N^{-\log^3 N}$, the matrix $\frac{1}{\sqrt{m}} A$ satisfies the RIP of order $s$. Examples of uniformly bounded orthonormal systems: the complex exponential basis (with uniform orthogonalization measure), and the Chebyshev polynomial basis, orthonormal on $[-1,1]$ w.r.t. the Chebyshev measure.
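
For instance, the Chebyshev system can be sampled and checked numerically; a sketch (the normalizations are the standard ones, but are assumptions relative to the slide, which does not spell them out):

```python
import numpy as np

def chebyshev_sampling_matrix(N, m, seed=0):
    """A_{l,k} = psi_k(u_l) for psi_0 = 1, psi_k(u) = sqrt(2) cos(k arccos u),
    orthonormal w.r.t. the Chebyshev measure pi^{-1}(1-u^2)^{-1/2} du; here
    K = sqrt(2).  Points u_l = cos(pi xi_l), xi_l uniform in [0,1], are
    i.i.d. Chebyshev-distributed."""
    rng = np.random.default_rng(seed)
    theta = np.pi * rng.random(m)              # u_l = cos(theta_l)
    A = np.cos(np.outer(theta, np.arange(N)))
    A[:, 1:] *= np.sqrt(2.0)
    return A

A = chebyshev_sampling_matrix(N=50, m=4000)
print(np.abs(A.T @ A / 4000 - np.eye(50)).max())   # small: (1/m) A^T A ≈ I
```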

13 The Legendre polynomials. The Legendre polynomials $P_l$ are orthonormal on $D = [-1,1]$ with respect to the normalized Lebesgue measure $dt/2$. They are generated via Gram-Schmidt orthogonalization applied to the monomial system $1, t, t^2, \ldots$ They are used in schemes for solving several classical PDEs, such as Laplace's equation. Fast Legendre polynomial transform: requires only $O(N \log N)$ arithmetic operations.
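
With this normalization, $P_l = \sqrt{2l+1}\, L_l$, where $L_l$ is the classical Legendre polynomial with $L_l(1) = 1$. A quick numerical look at their growth (a sketch using scipy.special.eval_legendre):

```python
import numpy as np
from scipy.special import eval_legendre

# P_l = sqrt(2l+1) L_l is orthonormal w.r.t. dt/2 on [-1,1]; its sup norm
# sqrt(2l+1) is attained at the endpoints, where L_l(1) = 1.
t = np.linspace(-1, 1, 20001)
for l in [0, 1, 5, 20, 79]:
    P = np.sqrt(2 * l + 1) * eval_legendre(l, t)
    print(l, np.abs(P).max(), np.sqrt(2 * l + 1))   # the two columns agree
```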

14 The Legendre polynomials satisfy $\|P_l\|_\infty = \sqrt{2l+1}$, so $K = \sup_{0 \le l \le N-1} \|P_l\|_\infty = \sqrt{2N-1}$. From existing theory: $s$-sparse polynomials of the form $f(t) = \sum_{k=0}^{N-1} x_k P_k(t)$ may be recovered from their values at $m = O(s K^2 \ln^4 N) = O(s N \ln^4 N)$ sampling points. But this is trivial: with $m \ge N$ samples one can simply solve the full linear system for all $N$ coefficients.

16 Sparse recovery for Legendre orthonormal systems. Theorem (Rauhut, W 10). Let $N$ and $s$ be given. Suppose that $m = O(s \log^4 N)$ sampling points $t_1, \ldots, t_m$ are drawn i.i.d. from the Chebyshev measure $d\nu = \pi^{-1}(1-t^2)^{-1/2}\, dt$ on $[-1,1]$. Then with probability exceeding $1 - N^{-\log^3 N}$, the following holds for all polynomials $f(t) = \sum_{k=0}^{N-1} x_k P_k(t)$: suppose that noisy sample values $y = (f(t_1) + \eta_1, \ldots, f(t_m) + \eta_m)$ are observed, with $\|\eta\|_2 \le \sqrt{m}\,\epsilon$. Then the coefficient vector $x = (x_0, x_1, \ldots, x_{N-1})$ is approximated by $x^\# = \arg\min_z \|z\|_1$ subject to $\|Az - y\|_2 \le \sqrt{m}\,\epsilon$, up to error $\|x - x^\#\|_2 \le C_1 \frac{\|x - x_s\|_1}{\sqrt{s}} + C_2\,\epsilon$.

17 Numerical example. A sparse Legendre polynomial with sparsity $s = 5$, maximal degree $N = 80$, and $m = 20$ i.i.d. sampling points drawn from the Chebyshev measure. Reconstruction by $\ell_1$-minimization is exact!
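
A sketch reproducing an experiment of this shape (the talk's exact instance is not available; at these parameters recovery is typically, though not always, exact):

```python
import numpy as np
from scipy.special import eval_legendre
from scipy.optimize import linprog

rng = np.random.default_rng(7)
N, m, s = 80, 20, 5

# s-sparse coefficient vector in the normalized Legendre basis sqrt(2k+1) L_k
x = np.zeros(N)
x[rng.choice(N, s, replace=False)] = rng.standard_normal(s)

# m i.i.d. Chebyshev-distributed sampling points t_l = cos(pi xi_l)
t = np.cos(np.pi * rng.random(m))
A = np.array([np.sqrt(2 * k + 1) * eval_legendre(k, t) for k in range(N)]).T
y = A @ x                                  # noiseless samples f(t_l)

# equality-constrained l1 minimization (basis pursuit) as a linear program
res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
print(np.linalg.norm(x - x_hat))           # ~0 when recovery is exact
```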

18 Numerical example. The same sparse Legendre polynomial (black) subject to noisy random measurements, and the stable approximation obtained by $\ell_1$-minimization.

19 Key idea in proof: premultiplication. The Legendre polynomials grow near the endpoints $t = \pm 1$, but at a uniform rate: $|P_n(t)| < 2\pi^{-1/2}(1-t^2)^{-1/4}$ for $-1 \le t \le 1$ (constant due to S. Bernstein). So the functions $Q_n(t) = \sqrt{\pi/2}\,(1-t^2)^{1/4} P_n(t)$ form a uniformly bounded system (with $\|Q_n\|_\infty \le \sqrt{2}$). Also, the $Q_n$'s are orthonormal with respect to the Chebyshev measure $d\nu = \pi^{-1}(1-t^2)^{-1/2}\, dt$ on $[-1,1]$: $\int_{-1}^{1} Q_n(t) Q_m(t)\, \pi^{-1}(1-t^2)^{-1/2}\, dt = \int_{-1}^{1} P_n(t) P_m(t)\, \frac{dt}{2} = \delta_{n,m}$.
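
Both claims are easy to check numerically; a sketch (Gauss-Chebyshev quadrature with nodes $t_j = \cos((2j-1)\pi/(2M))$ approximates integrals against the Chebyshev measure by plain averages):

```python
import numpy as np
from scipy.special import eval_legendre

def Q(n, t):
    """Q_n(t) = sqrt(pi/2) (1-t^2)^{1/4} P_n(t), with P_n = sqrt(2n+1) L_n."""
    return (np.sqrt(np.pi / 2) * (1 - t**2) ** 0.25
            * np.sqrt(2 * n + 1) * eval_legendre(n, t))

t = np.linspace(-1, 1, 20001)
print(max(np.abs(Q(n, t)).max() for n in range(60)))  # <= sqrt(2) ~ 1.414

# Orthonormality w.r.t. the Chebyshev measure: with Gauss-Chebyshev nodes,
# int f(t) pi^{-1} (1-t^2)^{-1/2} dt ~ (1/M) sum_j f(t_j).
M = 2000
tj = np.cos((2 * np.arange(1, M + 1) - 1) * np.pi / (2 * M))
G = np.array([[np.mean(Q(n, tj) * Q(k, tj)) for k in range(8)]
              for n in range(8)])
print(np.abs(G - np.eye(8)).max())                    # ~0
```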

21 Outline Sparse recovery for bounded orthonormal systems Sparse recovery for (certain) unbounded orthonormal systems From compressive sensing to function approximation Applications

22 Universality of the Chebyshev measure. Consider a probability measure $\nu(t)\,dt$ on $[-1,1]$. (Szegő): If $\nu(t)$ satisfies a mild continuity condition, then the polynomials $p_n^\nu$ that are orthonormal w.r.t. $\nu$ still satisfy a uniform growth condition: $|p_n^\nu(t)| \le C_\nu (1-t^2)^{-1/4} \nu(t)^{-1/2}$. As before, $q_n^\nu(t) = (1-t^2)^{1/4} \nu(t)^{1/2} p_n^\nu(t)$ is a uniformly bounded system, orthonormal w.r.t. the Chebyshev measure $(1-t^2)^{-1/2}\, dt$.
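
A concrete instance, as a sketch (treating $\nu$ as a probability density is an assumption): for $\nu(t) = \frac{2}{\pi}\sqrt{1-t^2}$ the orthonormal polynomials are the Chebyshev polynomials of the second kind $U_n$, and $q_n^\nu(t) = \sqrt{2/\pi}\,\sin((n+1)\arccos t)$ is uniformly bounded by $\sqrt{2/\pi}$, exactly as Szegő's theory predicts:

```python
import numpy as np
from scipy.special import eval_chebyu

# q_n(t) = (1-t^2)^{1/4} nu(t)^{1/2} U_n(t) = sqrt(2/pi) sin((n+1) arccos t)
t = np.linspace(-1, 1, 20001)
q_sup = max(np.abs(np.sqrt(2 / np.pi) * np.sqrt(1 - t**2)
                   * eval_chebyu(n, t)).max() for n in range(40))
print(q_sup, np.sqrt(2 / np.pi))   # both ~0.798
```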

23 From compressive sensing to function approximation. Consider the weighted sup norm on continuous functions on $[-1,1]$, $\|f\|_{\infty,w} = \sup_{t \in [-1,1]} |f(t)|\, w(t)$, with $w(t) = (1-t^2)^{1/4} \sqrt{\nu(t)}$, and the $s$-term approximation error $\sigma_{N,s}(f)_{\infty,w} = \inf\{\|f - \sum_{k=0}^{N-1} x_k p_k^\nu\|_{\infty,w} : (x_k) \in \mathbb{R}^N,\ \|x\|_0 \le s\}$. Theorem (Rauhut and W, 10). Let $N, m, s$ be given with $m = O(s \log^4 N)$. Then there exist sampling points $t_1, \ldots, t_m$ (chosen according to the Chebyshev distribution) and an efficient reconstruction procedure ($\ell_1$ minimization) such that for any continuous function $f$, the polynomial $P$ of degree at most $N-1$ reconstructed from $f(t_1), \ldots, f(t_m)$ satisfies $\|f - P\|_{\infty,w} \le C_w \sqrt{s}\, \sigma_{N,s}(f)_{\infty,w}$.

25 Extension to spherical harmonics. [Figure: the real parts $\mathrm{Re}(Y_0^0)$, $\mathrm{Re}(Y_1^0)$, $\mathrm{Re}(Y_2^0)$, $\mathrm{Re}(Y_2^1)$, $\mathrm{Re}(Y_3^0)$, $\mathrm{Re}(Y_3^1)$.] $Y_l^m(\phi, \theta) = C_{l,m} P_l^m(\cos\theta)\, e^{im\phi}$, $-l \le m \le l$, $l = 0, 1, \ldots$, where the $P_l^m$ are the associated Legendre functions. Fast spherical harmonic transform: requires only $O(N (\log N)^2)$ arithmetic operations.
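
The $Y_l^m$ are available in SciPy; a small sketch checking orthonormality by Monte Carlo over the uniform measure on the sphere (scipy.special.sph_harm is the historical name; recent SciPy versions rename it sph_harm_y):

```python
import numpy as np
from scipy.special import sph_harm

# scipy's sph_harm(m, l, az, polar) takes the azimuthal angle third and the
# polar angle fourth.  With points uniform on the sphere,
# 4*pi * E[Y_l^m conj(Y_l'^m')] ~ delta_{(l,m),(l',m')}.
rng = np.random.default_rng(0)
n_pts = 200_000
az = 2 * np.pi * rng.random(n_pts)
polar = np.arccos(1 - 2 * rng.random(n_pts))   # cos(polar) uniform on [-1,1]

Y21 = sph_harm(1, 2, az, polar)
Y31 = sph_harm(1, 3, az, polar)
print(4 * np.pi * np.mean(Y21 * np.conj(Y21)).real)  # ~1
print(4 * np.pi * np.mean(Y21 * np.conj(Y31)).real)  # ~0
```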

26 Extension to spherical harmonics. The associated Legendre functions $P_l^m$ may be written in terms of orthonormal polynomials with respect to the positive weights $\nu_\alpha(x) = (1-x^2)^\alpha$, $\alpha = 0, 1, \ldots, n$. Krasikov 08: $|(1-x^2)^{1/4} \nu_\alpha(x)^{1/2} p_n^\alpha(x)| \le C_\alpha = O(\alpha^{1/4})$. From this one can derive... Corollary. Let $N, m, s$ be given with $m = O(s N^{1/4} \log^4 N)$. Then there exist sampling points $t_1, \ldots, t_m$ (chosen according to the spherical Chebyshev distribution) and an efficient reconstruction procedure ($\ell_1$ minimization) such that any function on the sphere which is an $s$-sparse expansion in the first $N$ spherical harmonic basis functions may be exactly recovered from these sampling points.

27 Outline Sparse recovery for bounded orthonormal systems Sparse recovery for (certain) unbounded orthonormal systems From compressive sensing to function approximation Applications

28 Application: Cosmic Microwave Background radiation (CMB) map. WMAP full-sky temperature anisotropy map. $T(\theta, \phi) = \sum_{l=0}^{\infty} \sum_{m=-l}^{l} a_{l,m} Y_l^m(\theta, \phi)$, where the $Y_l^m$ are the spherical harmonics. Red band: measurements are corrupted by interference with the galactic foreground signal.

29 The CMB map is compressible in spherical harmonics. [Figure: relative best $s$-term approximation error $\|a - a_s\|_2 / \|a\|_2$, $s = 1, \ldots, n^2$; source: Lyman Page.] Consider the coefficient vector $a = (a_{l,m})$ in $T(\theta, \phi) \approx \sum_{l=0}^{n} \sum_{m=-l}^{l} a_{l,m} Y_l^m(\theta, \phi)$. This vector is predicted and observed to be compressible in the spherical harmonic basis.

30 The CMB map is compressible in spherical harmonics. (J. Starck et al., 08)¹: Propose full-sky CMB map inpainting from partial CMB measurements $T(\theta_j, \phi_k) = m_{j,k}$. Obtain the coefficients $a = (a_{l,m})$ by solving the $\ell_1$-minimization problem $a^\# = \arg\min \sum_{l=0}^{N} \sum_{m=-l}^{l} |z_{l,m}|$ s.t. $\sum_{l=0}^{N} \sum_{m=-l}^{l} z_{l,m} Y_l^m(\theta_j, \phi_k) = m_{j,k}$. ¹Source: P. Abrial, Y. Moudden, J. Starck, J. Fadili, J. Delabrouille, and M. Nguyen. CMB data analysis and sparsity. Stat. Methodol., 2008.

31 Final remarks. Our results provide some theoretical justification for the good inpainting results obtained for the CMB map. Open questions: 1. Sparse recovery in the spherical harmonic basis: can we get better recovery guarantees? 2. Can we generalize these results to functions sparse in bases of eigenfunctions of a more general class of PDEs? THANK YOU
