Strengthened Sobolev inequalities for a random subspace of functions
1 Strengthened Sobolev inequalities for a random subspace of functions. Rachel Ward, University of Texas at Austin. April 2013.
2 Discrete Sobolev inequalities

Proposition (Sobolev inequality for discrete images). Let $x \in \mathbb{R}^{N \times N}$ be zero-mean. Then
$$\Big( \sum_{j=1}^{N} \sum_{k=1}^{N} |x_{j,k}|^2 \Big)^{1/2} \lesssim \sum_{j=1}^{N} \sum_{k=1}^{N} \big[\, |x_{j+1,k} - x_{j,k}| + |x_{j,k+1} - x_{j,k}| \,\big],$$
or $\|x\|_2 \lesssim \|x\|_{TV}$.

Theorem (New! Strengthened Sobolev inequality). With probability $\geq 1 - e^{-cm}$, the following holds for all images $x \in \mathbb{R}^{N \times N}$ in the null space of an $m \times N^2$ random Gaussian matrix:
$$\|x\|_2 \lesssim \frac{[\log(N)]^{3/2}}{\sqrt{m}}\, \|x\|_{TV}.$$
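The plain Sobolev inequality is easy to probe numerically. The following sketch (ours, not from the slides) computes both sides for a random zero-mean image, using the anisotropic total variation seminorm defined later in the talk:

```python
# Minimal numerical sanity check of ||x||_2 <= C ||x||_TV for a zero-mean image.
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal((N, N))
x -= x.mean()  # the proposition requires a zero-mean image

# Anisotropic TV: sum of absolute vertical and horizontal differences
tv = np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()
l2 = np.linalg.norm(x)
print(f"||x||_2 = {l2:.2f}, ||x||_TV = {tv:.2f}, ratio = {l2 / tv:.4f}")
# The ratio stays bounded (here well below 1), consistent with the proposition.
```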
4 Joint work with Deanna Needell. Research supported in part by the ONR and the Alfred P. Sloan Foundation.

5 Motivation: Image processing

6 Images are compressible in discrete gradient
7 Images are compressible in discrete gradient

We define the discrete directional derivatives of an image $x \in \mathbb{R}^{N \times N}$ as
$$x_u : \mathbb{R}^{N \times N} \to \mathbb{R}^{(N-1) \times N}, \quad (x_u)_{j,k} = x_{j,k} - x_{j-1,k},$$
$$x_v : \mathbb{R}^{N \times N} \to \mathbb{R}^{N \times (N-1)}, \quad (x_v)_{j,k} = x_{j,k} - x_{j,k-1},$$
and the discrete gradient operator $\nabla x = (x_u, x_v) \in \mathbb{R}^{N \times N \times 2}$.
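These definitions translate directly into array differences; a minimal NumPy sketch (the helper name is ours):

```python
import numpy as np

def discrete_gradient(x):
    """Return (x_u, x_v) as defined above: first differences down columns
    and along rows, with shapes (N-1, N) and (N, N-1)."""
    x_u = x[1:, :] - x[:-1, :]   # (x_u)_{j,k} = x_{j,k} - x_{j-1,k}
    x_v = x[:, 1:] - x[:, :-1]   # (x_v)_{j,k} = x_{j,k} - x_{j,k-1}
    return x_u, x_v

x = np.arange(16.0).reshape(4, 4)
x_u, x_v = discrete_gradient(x)
print(x_u.shape, x_v.shape)  # (3, 4) (4, 3)
```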
8 Images are also compressible in wavelet bases

Reconstruction of Boats from 10% of its Haar wavelet coefficients. $H : \mathbb{R}^{N \times N} \to \mathbb{R}^{N \times N}$ is the bivariate Haar wavelet transform. Figure: first few Haar basis functions.
9 Notation

The image $\ell_p$-norm is $\|u\|_p := \big( \sum_{j=1}^{N} \sum_{k=1}^{N} |u_{j,k}|^p \big)^{1/p}$.

$u$ is $s$-sparse if $\|u\|_0 := \#\{(j,k) : u_{j,k} \neq 0\} \leq s$.

$u_s = \arg\min_{\{z : \|z\|_0 = s\}} \|u - z\|$ is the best $s$-sparse approximation to $u$.

$\sigma_s(u)_p = \|u - u_s\|_p$ is the $s$-sparse approximation error.

Natural images are compressible:
- $\sigma_s(\nabla x)_2 = \|\nabla x - (\nabla x)_s\|_2$ decays quickly in $s$,
- $\sigma_s(Hx)_2 = \|Hx - (Hx)_s\|_2$ decays quickly in $s$.
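The best $s$-sparse approximation is just hard thresholding at the $s$ largest magnitudes; a short sketch of the notation (function names are ours):

```python
import numpy as np

def best_s_sparse(u, s):
    """Best s-sparse approximation u_s: keep the s largest-magnitude entries."""
    u_s = np.zeros_like(u)
    keep = np.unravel_index(np.argsort(np.abs(u), axis=None)[-s:], u.shape)
    u_s[keep] = u[keep]
    return u_s

def sigma_s(u, s, p=2):
    """s-sparse approximation error sigma_s(u)_p = ||u - u_s||_p."""
    return np.linalg.norm((u - best_s_sparse(u, s)).ravel(), ord=p)

rng = np.random.default_rng(1)
u = rng.standard_normal((8, 8))
print([round(sigma_s(u, s), 2) for s in (1, 16, 32, 64)])  # decays to 0 at s = 64
```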
11 Imaging via compressed sensing

If images are so compressible (nearly low-dimensional), do we really need to know all of the pixel values for accurate reconstruction, or can we get away with taking a smaller number of linear measurements?
12 MRI imaging: speed up by exploiting sparsity?

In Magnetic Resonance Imaging (MRI), rotating magnetic field measurements are 2D Fourier transform measurements:
$$y_{k_1,k_2} = \big( \mathcal{F}(x) \big)_{k_1,k_2} = \frac{1}{N} \sum_{j_1,j_2=1}^{N} x_{j_1,j_2}\, e^{-2\pi i (k_1 j_1 + k_2 j_2)/N}, \qquad -N/2 + 1 \leq k_1, k_2 \leq N/2.$$
Each measurement takes a certain amount of time, so a reduced number of samples $m \ll N^2$ means a reduced time of MRI scan.
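A hedged sketch of such measurements in NumPy: `norm="ortho"` reproduces the $1/N$ normalization above and `fftshift` centers the frequency grid; the uniform sampling pattern here is for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
N, m = 64, 500
x = rng.standard_normal((N, N))

# Full centered 2D DFT with 1/N scaling, then keep only m frequency pairs
F = np.fft.fftshift(np.fft.fft2(x, norm="ortho"))
flat = rng.choice(N * N, size=m, replace=False)
k1, k2 = np.unravel_index(flat, (N, N))
y = F[k1, k2]
print(y.shape)  # (500,) measurements instead of N^2 = 4096 pixel values
```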
13 Compressed sensing set-up

1. Signal (or image) of interest $f \in \mathbb{R}^d$ which is sparse: $\|f\|_0 \leq s$.
2. Measurement operator $\mathcal{A} : \mathbb{R}^d \to \mathbb{R}^m$, $m \ll d$, with $y = \mathcal{A} f$.
3. Problem: Using the sparsity of $f$, can we reconstruct $f$ efficiently from $y$?
14 A sufficient condition: Restricted Isometry Property

$A : \mathbb{R}^d \to \mathbb{R}^m$ satisfies the Restricted Isometry Property (RIP) of order $s$ if
$$\tfrac{3}{4} \|f\|_2 \leq \|Af\|_2 \leq \tfrac{5}{4} \|f\|_2 \quad \text{for all } s\text{-sparse } f.$$
Gaussian or Bernoulli measurement matrices satisfy the RIP with high probability when $m \gtrsim s \log d$. Random subsets of rows from Fourier and other matrices with a fast multiply have a similar property: $m \gtrsim s \log^4 d$ [Candès-Romberg-Tao 2006].
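Certifying the RIP is intractable, but its effect is easy to see empirically: a properly scaled Gaussian matrix nearly preserves the norm of random $s$-sparse vectors. A small illustration (ours):

```python
import numpy as np

rng = np.random.default_rng(9)
d, m, s = 1000, 200, 10
A = rng.standard_normal((m, d)) / np.sqrt(m)  # scaling gives E||Af||^2 = ||f||^2

ratios = []
for _ in range(500):
    f = np.zeros(d)
    f[rng.choice(d, s, replace=False)] = rng.standard_normal(s)
    ratios.append(np.linalg.norm(A @ f) / np.linalg.norm(f))
print(f"min ratio = {min(ratios):.3f}, max ratio = {max(ratios):.3f}")
# With m well above s log d, the ratios observed here concentrate inside
# [3/4, 5/4], as the RIP requires (this samples random, not worst-case, f).
```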
16 $\ell_1$-minimization for sparse recovery

Let $y = Af$, let $A$ satisfy the Restricted Isometry Property of order $s$, and set
$$\hat{f} = \operatorname*{argmin}_{g} \|g\|_1 \quad \text{such that } Ag = y.$$
Then [Candès-Romberg-Tao, Donoho]:
1. Exact recovery: If $f$ is $s$-sparse, then $\hat{f} = f$.
2. Stable recovery: If $f$ is approximately $s$-sparse, then $\hat{f} \approx f$. Precisely,
$$\|f - \hat{f}\|_1 \lesssim \sigma_s(f)_1, \qquad \|f - \hat{f}\|_2 \lesssim \tfrac{1}{\sqrt{s}}\, \sigma_s(f)_1.$$
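A hedged sketch of this program via the standard linear-programming reformulation (one of several equivalent ways to solve basis pursuit; the talk does not prescribe a solver):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
d, m, s = 60, 30, 4
A = rng.standard_normal((m, d)) / np.sqrt(m)

f = np.zeros(d)
f[rng.choice(d, s, replace=False)] = rng.standard_normal(s)  # s-sparse signal
y = A @ f

# min sum(t) s.t. Ag = y, -t <= g <= t   (stacked variables z = [g, t])
c = np.concatenate([np.zeros(d), np.ones(d)])
A_eq = np.hstack([A, np.zeros((m, d))])
A_ub = np.vstack([np.hstack([np.eye(d), -np.eye(d)]),
                  np.hstack([-np.eye(d), -np.eye(d)])])
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * d), A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * d + [(0, None)] * d)
f_hat = res.x[:d]
print("recovery error:", np.linalg.norm(f_hat - f))  # ~0: exact recovery w.h.p.
```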
17 $\ell_1$-minimization for sparse recovery

Similarly, if $B$ is an orthonormal basis in which $f$ is assumed sparse, and $A_B = AB^{-1}$ satisfies the Restricted Isometry Property of order $s$, set
$$\hat{f} = \operatorname*{argmin}_{g} \|Bg\|_1 \ \text{ such that } Ag = y, \qquad \text{equivalently} \qquad B\hat{f} = \operatorname*{argmin}_{h} \|h\|_1 \ \text{ such that } AB^{-1}h = y.$$
Then
$$\|B(f - \hat{f})\|_1 \lesssim \sigma_s(Bf)_1, \qquad \|f - \hat{f}\|_2 \lesssim \tfrac{1}{\sqrt{s}}\, \sigma_s(Bf)_1.$$
18 Total variation minimization for sparse recovery

For $x \in \mathbb{R}^{N \times N}$, the total variation seminorm is $\|x\|_{TV} = \|\nabla x\|_1$. If $x$ has $s$-sparse gradient, from observations $y = Ax$ one solves

Total Variation minimization: $\hat{x} = \operatorname*{argmin}_{z} \|z\|_{TV}$ subject to $Az = Ax$ (a solver sketch follows this slide).

Immediate guarantees:
1. $\|\nabla \hat{x} - \nabla x\|_2 \lesssim \tfrac{1}{\sqrt{s}}\, \sigma_s(\nabla x)_1$,
2. $\|\nabla \hat{x} - \nabla x\|_1 \lesssim \sigma_s(\nabla x)_1$.
Apply the discrete Sobolev inequality: $\|x - \hat{x}\|_2 \lesssim \sigma_s(\nabla x)_1$. Can we do better?
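A hedged sketch of this program using cvxpy (assumed available), with the anisotropic TV $\|\nabla z\|_1$ built from sparse difference operators acting on the row-major vectorized image:

```python
import numpy as np
import scipy.sparse as sp
import cvxpy as cp

rng = np.random.default_rng(4)
N, m = 16, 120
x = np.zeros((N, N))
x[4:10, 5:12] = 1.0                               # image with s-sparse gradient
A = rng.standard_normal((m, N * N)) / np.sqrt(m)  # random Gaussian measurements
y = A @ x.ravel()                                 # row-major vectorization

# Sparse difference operators on the vectorized image
D = sp.diags([-1.0, 1.0], [0, 1], shape=(N - 1, N))
I = sp.identity(N)
Du = sp.kron(D, I)   # vertical differences  (x_u)
Dv = sp.kron(I, D)   # horizontal differences (x_v)

z = cp.Variable(N * N)
tv = cp.norm1(Du @ z) + cp.norm1(Dv @ z)          # ||grad z||_1
prob = cp.Problem(cp.Minimize(tv), [A @ z == y])
prob.solve()
print("reconstruction error:", np.linalg.norm(z.value - x.ravel()))  # ~0
```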
20 Discrete Sobolev inequalities

Proposition (Sobolev inequality). Let $x \in \mathbb{R}^{N \times N}$ be zero-mean. Then $\|x\|_2 \lesssim \|x\|_{TV}$.

Theorem (Strengthened Sobolev inequality). For all images $x \in \mathbb{R}^{N^2}$ in the null space of a matrix $A : \mathbb{R}^{N^2} \to \mathbb{R}^m$ which, when composed with the bivariate Haar wavelet transform, $AH^{-1} : \mathbb{R}^{N^2} \to \mathbb{R}^m$, has the Restricted Isometry Property of order $2s$, the following holds:
$$\|x\|_2 \lesssim \frac{\log(N^2/s)}{\sqrt{s}}\, \|x\|_{TV}.$$
22 Consequence for total variation minimization

Suppose that the matrix $A : \mathbb{R}^{N^2} \to \mathbb{R}^m$ is such that, composed with the bivariate Haar wavelet transform, $AH^{-1} : \mathbb{R}^{N^2} \to \mathbb{R}^m$ has the Restricted Isometry Property of order $2s$. From observations $y = Ax$, consider

Total Variation minimization: $\hat{x} = \operatorname*{argmin}_{z} \|z\|_{TV}$ subject to $Az = y$.

Then
1. $\|\nabla \hat{x} - \nabla x\|_2 \lesssim \tfrac{1}{\sqrt{s}}\, \sigma_s(\nabla x)_1$,
2. $\|\hat{x} - x\|_{TV} \lesssim \sigma_s(\nabla x)_1$,
3. $\|x - \hat{x}\|_2 \lesssim \frac{\log(N^2/s)}{\sqrt{s}}\, \sigma_s(\nabla x)_1$.
24 Corollary (Sobolev inequality for random matrices). With probability $\geq 1 - e^{-cm}$, the following holds for all images $x \in \mathbb{R}^{N \times N}$ in the null space of an $m \times N^2$ random Gaussian matrix:
$$\|x\|_2 \lesssim \frac{[\log(N)]^{3/2}}{\sqrt{m}}\, \|x\|_{TV}.$$
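An illustrative experiment (ours, not from the slides): draw a Gaussian matrix, project a random image onto its null space, and compare the ratio $\|x\|_2 / \|x\|_{TV}$ to the predicted scale.

```python
import numpy as np

rng = np.random.default_rng(5)
N, m = 32, 200
A = rng.standard_normal((m, N * N))

# Orthonormal basis for null(A) from the SVD; project a random image onto it
_, _, Vt = np.linalg.svd(A, full_matrices=True)
null_basis = Vt[m:]                       # rows span null(A)
v = null_basis.T @ (null_basis @ rng.standard_normal(N * N))
x = v.reshape(N, N)

tv = np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()
print("observed ||x||_2 / ||x||_TV :", np.linalg.norm(x) / tv)
print("predicted scale [log N]^{3/2} / sqrt(m):", np.log(N) ** 1.5 / np.sqrt(m))
# The observed ratio sits below the predicted scale, as the corollary guarantees.
```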
25 Corollary (Sobolev inequality for Fourier constraints). Select $m$ frequency pairs $(k_1, k_2)$ with replacement from the distribution
$$p_{k_1,k_2} \propto \frac{1}{(|k_1| + 1)^2 + (|k_2| + 1)^2}.$$
With probability $\geq 1 - e^{-cm}$, the following holds for all images $x \in \mathbb{R}^{N \times N}$ with vanishing Fourier transform at the selected frequencies, $\big( \mathcal{F}(x) \big)_{k_1,k_2} = 0$:
$$\|x\|_2 \lesssim \frac{[\log(N)]^{5}}{\sqrt{m}}\, \|x\|_{TV}.$$
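The variable-density sampling step is straightforward to implement; a sketch (ours) that draws $m$ frequency pairs from the stated distribution:

```python
import numpy as np

rng = np.random.default_rng(6)
N, m = 64, 600
k = np.arange(-N // 2 + 1, N // 2 + 1)          # frequency range -N/2+1 .. N/2
K1, K2 = np.meshgrid(k, k, indexing="ij")
p = 1.0 / ((np.abs(K1) + 1) ** 2 + (np.abs(K2) + 1) ** 2)
p = p.ravel() / p.sum()                          # normalize to a distribution

draws = rng.choice(N * N, size=m, replace=True, p=p)  # with replacement
k1, k2 = K1.ravel()[draws], K2.ravel()[draws]
print("rms sampled radius:", np.sqrt(np.mean(k1**2 + k2**2)))
# Low frequencies are sampled far more often than high ones.
```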
26 Sobolev inequalities in action!

Figure: reconstructions of an MRI image under different Fourier sampling schemes. Panels: original MRI image; low frequencies only; equispaced radial lines; variable-density samples with $p_{k_1,k_2} \propto (k_1^2 + k_2^2)^{-1/2}$; variable-density samples with $p_{k_1,k_2} \propto (k_1^2 + k_2^2)^{-1}$.
27 Sketch of the proof

Theorem (Strengthened Sobolev inequality). For all images $x \in \mathbb{R}^{N^2}$ in the null space of a matrix $A : \mathbb{R}^{N^2} \to \mathbb{R}^m$ which, when composed with the bivariate Haar wavelet transform, $AH^{-1} : \mathbb{R}^{N^2} \to \mathbb{R}^m$, has the Restricted Isometry Property of order $2s$, the following holds:
$$\|x\|_2 \lesssim \frac{\log(N^2/s)}{\sqrt{s}}\, \|x\|_{TV}.$$
29 Lemmas we use:

1. Denote the ordered bivariate Haar wavelet coefficients of mean-zero $x \in \mathbb{R}^{N^2}$ by $|c_{(1)}| \geq |c_{(2)}| \geq \cdots \geq |c_{(N^2)}|$. Then [Cohen, DeVore, Petrushev, Xu 1999]
$$|c_{(k)}| \lesssim \frac{\|x\|_{TV}}{k}.$$
That is, the Haar coefficient sequence is in weak-$\ell_1$.
2. If $c = c_{S_0} + c_{S_1} + c_{S_2} + \cdots$, where $c_{S_0} = (c_{(1)}, c_{(2)}, \ldots, c_{(s)}, 0, \ldots, 0)$, $c_{S_1} = (0, \ldots, 0, c_{(s+1)}, \ldots, c_{(2s)}, 0, \ldots, 0)$, and so on, are $s$-sparse, then
$$\|c_{S_j}\|_2 \leq \frac{\|c_{S_{j-1}}\|_1}{\sqrt{s}}.$$
3. Recall that $AH^{-1} : \mathbb{R}^{N^2} \to \mathbb{R}^m$ having the RIP of order $s$ means that
$$\tfrac{3}{4} \|x\|_2 \leq \|AH^{-1} x\|_2 \leq \tfrac{5}{4} \|x\|_2 \quad \text{for all } s\text{-sparse } x.$$
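Lemma 1 can be checked numerically, assuming PyWavelets (pywt) is available for the bivariate Haar transform:

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
N = 64
x = rng.standard_normal((N, N))
x -= x.mean()                                 # mean-zero, as the lemma requires

coeffs = pywt.wavedec2(x, "haar")             # full bivariate Haar decomposition
c, _ = pywt.coeffs_to_array(coeffs)
c_sorted = np.sort(np.abs(c.ravel()))[::-1]   # |c_(1)| >= |c_(2)| >= ...

tv = np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()
k = np.arange(1, c_sorted.size + 1)
print("max of k * |c_(k)| / ||x||_TV :", (k * c_sorted).max() / tv)
# Lemma 1 bounds this quantity by an absolute constant: the coefficient
# sequence is in weak-l1 with norm controlled by ||x||_TV.
```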
30 Suppose that $\Psi = AH^{-1} : \mathbb{R}^{N^2} \to \mathbb{R}^m$ has the RIP of order $2s$, and suppose $Ax = 0$. Decompose $c = Hx$ into ordered $s$-sparse blocks $c = c_{S_0} + c_{S_1} + c_{S_2} + \cdots$. Then $\Psi c = AH^{-1} H x = Ax = 0$, and
$$\begin{aligned}
0 = \|\Psi c\|_2 &\geq \|\Psi(c_{S_0} + c_{S_1})\|_2 - \sum_{j \geq 2} \|\Psi c_{S_j}\|_2 \\
&\geq \tfrac{3}{4} \|c_{S_0} + c_{S_1}\|_2 - \tfrac{5}{4} \sum_{j \geq 2} \|c_{S_j}\|_2 && \text{(RIP of } \Psi\text{)} \\
&\geq \tfrac{3}{4} \|c_{S_0}\|_2 - \tfrac{5}{4} \sum_{j \geq 1} \frac{\|c_{S_j}\|_1}{\sqrt{s}} && \text{(block trick)} \\
&\geq \tfrac{3}{4} \|c_{S_0}\|_2 - \frac{5}{4} \cdot \frac{\|x\|_{TV} \log(N^2/s)}{\sqrt{s}} && \text{(} c \text{ in weak } \ell_1\text{)}
\end{aligned}$$
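The "block trick" step is elementary (each entry of block $j$ is at most the average of block $j-1$); a quick numerical confirmation (ours):

```python
import numpy as np

rng = np.random.default_rng(8)
s, nblocks = 16, 8
c = np.sort(np.abs(rng.standard_normal(s * nblocks)))[::-1]  # decreasing magnitudes
blocks = c.reshape(nblocks, s)                               # c_{S_0}, c_{S_1}, ...

lhs = np.linalg.norm(blocks[1:], axis=1)            # ||c_{S_j}||_2 for j >= 1
rhs = np.abs(blocks[:-1]).sum(axis=1) / np.sqrt(s)  # ||c_{S_{j-1}}||_1 / sqrt(s)
print(np.all(lhs <= rhs))  # True: each block is dominated by the previous one
```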
32 So
1. $\|c_{S_0}\|_2 \lesssim \tfrac{1}{\sqrt{s}}\, \|x\|_{TV} \log(N^2/s)$,
2. $\|c - c_{S_0}\|_2 \lesssim \tfrac{1}{\sqrt{s}}\, \|x\|_{TV}$ ($c$ is in weak $\ell_1$).
Then
$$\|x\|_2 = \|c\|_2 \leq \|c_{S_0}\|_2 + \|c - c_{S_0}\|_2 \lesssim \frac{\log(N^2/s)}{\sqrt{s}}\, \|x\|_{TV}.$$
33 Extensions

1. Strengthened Sobolev inequalities and total variation minimization guarantees can be extended to multidimensional signals.
2. Strengthened Sobolev inequalities and total variation minimization guarantees are robust to noisy measurements: $y = Ax + \xi$, with $\|\xi\|_2 \leq \varepsilon$.
3. Sobolev inequalities for random subspaces in dimension $d = 1$? Error guarantees for total variation minimization in dimension $d = 1$?
4. Can we remove the extra factor of $\log(N)$ in the derived Sobolev inequalities?
5. Deterministic constructions of subspaces satisfying strengthened Sobolev inequalities (warning: hard!).
6. ...
34 References

Needell, Ward. "Stable image reconstruction using total variation minimization." SIAM Journal on Imaging Sciences 6.2 (2013).

Needell, Ward. "Near-optimal compressed sensing guarantees for total variation minimization." IEEE Transactions on Image Processing (2013).

Krahmer, Ward. "Beyond incoherence: stable and robust sampling strategies for compressive imaging." arXiv preprint (2012).