Estimating Unknown Sparsity in Compressed Sensing


Estimating Unknown Sparsity in Compressed Sensing
Miles Lopes, UC Berkeley, Department of Statistics
CSGF Program Review, July 16, 2014
(an early version was published at ICML 2013)

Overview of compressed sensing (CS)

Traditional A/D conversion is wasteful (graphic by Mishali and Eldar, 2011). Compressed sensing: do acquisition and compression in one step!

Natural signals are compressible: we should acquire the relevant 25k numbers rather than acquiring all 1M numbers.

A real example of compressed sensing: the single-pixel camera (Wakin et al., 2006)

- $x$ = scene (signal vector in $\mathbb{R}^p$)
- $a_i$ = grid of mirrors (binary vector in $\mathbb{R}^p$, $i = 1, \ldots, n$)
- $y_i = \langle a_i, x \rangle + \epsilon_i$ (diode output / linear measurement)

Mathematical setup

Compressed sensing is a framework for estimating a sparse vector from a small number of linear measurements,
$$y_i = \langle a_i, x \rangle + \epsilon_i, \qquad i = 1, \ldots, n.$$
In matrix notation, if $a_i$ is the $i$th row of $A \in \mathbb{R}^{n \times p}$, then $y = Ax + \epsilon$.

Goal: Design a matrix $A$ and use the noisy measurements $y$ to accurately recover $x$.

Obstacle: The dimension of $x \in \mathbb{R}^p$ is large, and the number of measurements is small: $n \ll p$.

Structure: The signal $x$ is sparse in some known basis (e.g. wavelet).

Note: The matrix $A$ is singular (short and wide). If $x$ is sparse in an orthonormal basis $U \in \mathbb{R}^{p \times p}$, the basis can be ignored abstractly by choosing $A := \tilde{A}U$, giving observations $y = \tilde{A}(Ux) + \epsilon$.
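To make the setup concrete, here is a minimal NumPy sketch that generates a toy instance of the model $y = Ax + \epsilon$; the dimensions, sparsity level, and noise scale are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, k = 1000, 200, 10            # ambient dimension, measurements, sparsity

# A k-sparse signal: k randomly chosen coordinates with Gaussian values.
x = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
x[support] = rng.normal(size=k)

# A Gaussian design: each row a_i has i.i.d. N(0, 1) entries.
A = rng.normal(size=(n, p))
eps = 0.01 * rng.normal(size=n)    # measurement noise
y = A @ x + eps                    # y_i = <a_i, x> + eps_i
```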

How do we recover $x$ from compressed measurements?

The fundamental theorem of compressed sensing

Suppose we have $y = Ax + \epsilon$ with noise bounded by $\|\epsilon\|_2 \le \epsilon_0$. Let $\hat{x}$ be a solution to the convex optimization problem
$$\text{minimize } \|v\|_1 \quad \text{subject to } \|Av - y\|_2 \le \epsilon_0, \; v \in \mathbb{R}^p.$$
Define the sparsity $\|x\|_0 := \#\{j : x_j \neq 0\}$.

Theorem. If $A \in \mathbb{R}^{n \times p}$ is randomly drawn from a suitable ensemble, and
$$n \gtrsim \|x\|_0 \log\!\left(\frac{p}{\|x\|_0}\right),$$
then with high probability, $\|\hat{x} - x\|_2 \lesssim \epsilon_0$.

cf. Candès, Romberg, and Tao '06; Donoho '06
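The convex program above is a second-order cone program and can be solved with off-the-shelf software. A minimal sketch using the cvxpy library is shown below; the function name and the reliance on cvxpy's default solver are assumptions made for illustration.

```python
import cvxpy as cp

def l1_recover(A, y, eps0):
    """Solve: minimize ||v||_1 subject to ||A v - y||_2 <= eps0."""
    v = cp.Variable(A.shape[1])
    problem = cp.Problem(cp.Minimize(cp.norm1(v)),
                         [cp.norm(A @ v - y, 2) <= eps0])
    problem.solve()
    return v.value

# Usage with the toy instance generated earlier
# (||eps||_2 is about 0.01 * sqrt(n) for that noise model):
# x_hat = l1_recover(A, y, eps0=0.01 * np.sqrt(n))
```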

The issue of unknown sparsity

Unknown sparsity

Many aspects of CS depend on $\|x\|_0 = \#\{j : x_j \neq 0\}$, but this number is typically unknown.

- How do we know if $x$ is sparse at all (in a given basis)?
- How do we know if we have taken enough measurements, i.e. $n \gtrsim \|x\|_0 \log(p/\|x\|_0)$? If we knew $\|x\|_0$, we could take just enough measurements.
- Realistic signals $x \in \mathbb{R}^p$ do not have coordinates exactly equal to 0. In this case, the value $\|x\|_0 = p$ is not descriptive of $x$.

$\Rightarrow$ Even if we could estimate $\|x\|_0$ perfectly, it wouldn't help in practice.
$\Rightarrow$ $\|x\|_0$ is the wrong measure of sparsity to estimate. We want to estimate a more realistic measure of sparsity!

An alternative view of sparsity

Idea: Any non-zero vector $x \in \mathbb{R}^p$ induces a distribution on $\{1, \ldots, p\}$. Consider the probability vector $(\pi_1(x), \ldots, \pi_p(x))$ given by
$$\pi_j(x) = \frac{|x_j|}{\|x\|_1}.$$
Suppose we draw a random index $J \sim \pi(x)$. Then $J$ is most likely to fall on the indices where $|x_j|$ is large:
effective states of $J$ = effective coordinates of $x$.
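As a small numerical illustration (with a hypothetical 5-coordinate signal), sampling $J \sim \pi(x)$ places nearly all of the probability mass on the two large coordinates:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.array([5.0, 3.0, 0.1, 0.05, 0.01])   # hypothetical signal
pi = np.abs(x) / np.abs(x).sum()            # pi_j(x) = |x_j| / ||x||_1

# Draw indices J ~ pi(x) and tabulate their empirical frequencies.
J = rng.choice(len(x), size=10_000, p=pi)
print(np.bincount(J, minlength=len(x)) / 10_000)
# roughly [0.61, 0.37, 0.01, 0.006, 0.001]: two effective coordinates
```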

Connecting entropy and sparsity

Fact: If $H(J)$ is the Shannon entropy of $J$, then
$$\#\text{ of effective states of } J \approx \exp(H(J)).$$
A more general measure of entropy is the Rényi entropy
$$H_q(J) = \frac{1}{1-q} \log \sum_{i=1}^p \pi_i^q,$$
where $\pi_i = \mathbb{P}(J = i) = \frac{|x_i|}{\|x\|_1}$ and $q \in [0, \infty]$. Recall $\|x\|_q^q = \sum_i |x_i|^q$. We now define the numerical sparsity:
$$s_q(x) := \exp(H_q(J)) = \left(\frac{\|x\|_q}{\|x\|_1}\right)^{\frac{q}{1-q}}.$$
It can be checked that as $q \to 0$, we obtain $s_q(x) \to \|x\|_0$.

$s_2(x) = \frac{\|x\|_1^2}{\|x\|_2^2}$ = effective number of coordinates

[Figure: three vectors in $\mathbb{R}^{100}$, plotted as coordinate value vs. coordinate index, with $s_2(x) = 16.4$ ($\|x\|_0 = 100$), $s_2(x) = 32.7$ ($\|x\|_0 = 45$), and $s_2(x) = 66.6$.]
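The closed-form expression for $s_q(x)$ is easy to compute directly. The sketch below (using a hypothetical test vector) also checks the limiting behavior $s_q(x) \to \|x\|_0$ as $q \to 0$:

```python
import numpy as np

def numerical_sparsity(x, q):
    """s_q(x) = (||x||_q / ||x||_1)^(q / (1 - q)), for q in (0, 1) or (1, 2]."""
    x = np.abs(np.asarray(x, dtype=float))
    lq = (x ** q).sum() ** (1.0 / q)   # ||x||_q
    l1 = x.sum()                       # ||x||_1
    return (lq / l1) ** (q / (1.0 - q))

x = np.array([1.0, 1.0, 1.0, 0.01, 0.01])  # 3 big coordinates, 2 tiny ones
print(numerical_sparsity(x, q=2))     # about 3.04: the "effective" count
print(numerical_sparsity(x, q=0.01))  # about 4.93: approaching ||x||_0 = 5
```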

[Figure: the sub-level sets $\{s_2(x) \le 1.1\}$ and $\{s_2(x) \le 1.9\}$ in $\mathbb{R}^2$, plotted in the $(x_1, x_2)$ plane.]

Properties of $s_q(x) := \left(\frac{\|x\|_q}{\|x\|_1}\right)^{\frac{q}{1-q}}$:

1. Continuous function of $x$ away from the origin whenever $q > 0$.
2. For all non-zero $x$, we have $1 \le s_q(x) \le p$.
3. When $q \to 0$, it matches hard sparsity: $s_q(x) \to \|x\|_0$.
4. Scale-invariance: $s_q(x) = s_q(cx)$ for any $c \neq 0$.
5. General lower bound for all non-zero $x$: $s_q(x) \le \|x\|_0$.

How does $s_2(x)$ relate to signal recovery?

A link between $s_2(x)$ and signal recovery

Theorem (L., 2014). Suppose the compressed sensing model holds with $x \neq 0$, $n \le p$, and $\|\epsilon\|_2 \le \epsilon_0$. Also assume the matrix $A \in \mathbb{R}^{n \times p}$ is randomly drawn from a suitable ensemble. Let
$$\hat{x} \in \operatorname{argmin}\{\|v\|_1 : \|Av - y\|_2 \le \epsilon_0, \; v \in \mathbb{R}^p\}.$$
Then there are absolute constants $c_1, c_2, c_3 > 0$ such that for any $n \ge 1$, the event
$$\frac{\|\hat{x} - x\|_2}{\|x\|_2} \le c_1 \frac{\epsilon_0}{\|x\|_2} + c_2 \sqrt{\frac{s_2(x)}{n} \log\!\left(\frac{pe}{n}\right)}$$
occurs with probability at least $1 - 2\exp(-c_3 n)$.

Essential points:
- $s_2(x)$ offers a meaningful way of measuring sample complexity.
- This result applies to any non-zero $x$ (hard sparsity is not required).

How do we estimate $s_q(x)$?

Recall the relation
$$s_q(x) = \left(\frac{\|x\|_q}{\|x\|_1}\right)^{\frac{q}{1-q}}.$$
$\Rightarrow$ It is enough to estimate $\|x\|_q$ for general $q$.

Estimation scheme for q = 2 and generalizations

Let $x \in \mathbb{R}^p$ be a fixed signal.

The case $q = 2$: If $a_i \in \mathbb{R}^p$ has i.i.d. $N(0, 1)$ coordinates, then
$$\frac{\langle a_i, x \rangle}{\|x\|_2} \sim N(0, 1).$$
Hence, if we had noiseless measurements $y_i = \langle a_i, x \rangle$, then $y_1, \ldots, y_n$ is a sample from $N(0, \|x\|_2^2)$.

Idea: Estimating $\|x\|_2^2$ amounts to estimating a scale parameter (the variance).
Note: This doesn't depend on the dimension or sparsity of $x$, only on its norm.

The general case $q \in (0, 2]$: If $a_i \in \mathbb{R}^p$ has i.i.d. coordinates drawn from a stable$_q(0, 1)$ distribution, then
$$\frac{\langle a_i, x \rangle}{\|x\|_q} \sim \text{stable}_q(0, 1)$$
$\Rightarrow$ estimate $\|x\|_q$ by viewing it as the scale parameter of an i.i.d. sample!
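Here is a rough sketch of this scale-parameter idea for $q = 2$ (Gaussian measurements) and $q = 1$ (Cauchy, i.e. 1-stable, measurements). The particular estimators, the sample mean of $y_i^2$ and the sample median of $|y_i|$, are simple illustrative choices and are not necessarily the estimators analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 1000, 500
x = np.zeros(p)
x[:20] = rng.normal(size=20)        # hypothetical signal with 20 active coords

# q = 2: Gaussian measurements give y_i ~ N(0, ||x||_2^2),
# so the mean of y_i^2 estimates ||x||_2^2.
y2 = rng.normal(size=(n, p)) @ x
l2_sq_hat = np.mean(y2 ** 2)

# q = 1: standard Cauchy (1-stable) measurements give y_i distributed as
# a Cauchy with scale ||x||_1; the median of |y_i| estimates that scale.
y1 = rng.standard_cauchy(size=(n, p)) @ x
l1_hat = np.median(np.abs(y1))

# Plug-in estimate of s_2(x) = ||x||_1^2 / ||x||_2^2 vs. the true value.
print(l1_hat ** 2 / l2_sq_hat)
print(np.abs(x).sum() ** 2 / (x ** 2).sum())
```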

Summary

- Derived $s_q(x)$ as a generalization of $\|x\|_0$.
- Showed that $s_q(x)$ measures sample complexity for recovery of $x$.
- Derived an estimator for $s_q(x)$ that does not rely on sparsity assumptions and is dimension-free.
- Proved a CLT for the estimator and obtained exact formulas for the asymptotic variance $\Rightarrow$ confidence intervals / hypothesis tests related to sparsity.
- Validated the asymptotic calculations with simulations.

Thank you for your attention. Special thanks to:

- Thesis advisor, Peter Bickel at UC Berkeley
- Practicum advisor, Philip Kegelmeyer at Sandia Livermore
- The CSGF fellowship, grant DE-FG02-97ER25308
- Everyone at the Krell Institute for making the fellowship possible
