Compressive Sensing and Beyond


Compressive Sensing and Beyond. Sohail Bahmani, Georgia Tech.


Signal Models

Classics: bandlimited signals. The Sampling Theorem: any signal with bandwidth B can be recovered from its samples collected uniformly at a rate no less than f_s = 2B. [Claude Shannon, Harry Nyquist]
x(t) = Σ_{n∈Z} x_n sinc(f_s t − n), where x_n = ⟨x(t), sinc(f_s t − n)⟩
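As a quick numerical illustration (a sketch, not from the slides: the two tones, the bandwidth, and the truncation of the series are arbitrary choices), the sinc interpolation formula can be checked directly in NumPy:

```python
import numpy as np

# Reconstruct a bandlimited signal from uniform samples at the Nyquist
# rate f_s = 2B via the (truncated) sinc series x(t) = sum_n x_n sinc(f_s t - n).
B = 4.0                    # bandwidth in Hz (illustrative)
fs = 2 * B                 # Nyquist rate
n = np.arange(-256, 257)   # truncated series of sample indices

def x(t):
    # two tones strictly inside the band [-B, B]
    return np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

samples = x(n / fs)                                  # uniform samples x_n
t = np.linspace(-2, 2, 201)                          # evaluation grid
# np.sinc is the normalized sinc sin(pi z)/(pi z), matching the theorem
x_hat = samples @ np.sinc(fs * t[None, :] - n[:, None])
err = np.max(np.abs(x_hat - x(t)))                   # truncation error only
```

The residual `err` comes purely from truncating the infinite series at |n| ≤ 256.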

Classics: time-frequency. Short-time Fourier transform:
X(τ, ω) = ⟨x(t), w(t − τ) e^{jωt}⟩,  x(t) = (1/2π) ∬ X(τ, ω) e^{jωt} dω dτ
Wavelets: ψ_{m,n}(t) = a^{m/2} ψ(a^m t − nb),  x(t) = Σ_{m∈Z} Σ_{n∈Z} ⟨x(t), ψ_{m,n}⟩ ψ_{m,n}(t)
[D. Gabor, A. Haar, I. Daubechies, R. Coifman]

Sparsity. A signal can be synthesized from a few atoms of a dictionary: x = F x̂ = F_Ω x̂_Ω, where the support Ω of the coefficient vector x̂ is small. [T. Tao, E. Candès, J. Romberg, D. Donoho]

Structured sparsity. Tree-structured models (e.g., the discrete wavelet transform) and graph-structured models (e.g., protein-protein interactions, gene regulatory networks). [Image: Couch Lab, George Mason University, http://osf1.gmu.edu/~rcouch/protprot.jpg]

Low rank. Model for images [B. Recht]. Other applications: minimum-order linear system realization, low-rank matrix completion, low-dimensional Euclidean embedding. [M. Fazel, P. Parrilo]

Compressive Sensing

Linear inverse problems: y = Ax with A ∈ R^{m×n}. When m = n and A is invertible, x = A^{−1} y. When m < n there is no unique inverse, but structure in x can make recovery possible.

Sparse solutions: argmin_x ‖x‖_0 subject to Ax = y, where ‖x‖_0 = |supp(x)| and supp(x) = {i : x_i ≠ 0}. Generally NP-hard, but tractable for many interesting A.

Specialize A. Many sufficient conditions have been proposed: the nullspace property, incoherence, restricted eigenvalue conditions, and the Restricted Isometry Property (RIP):
(1 − δ_k)‖x‖_2² ≤ ‖Ax‖_2² ≤ (1 + δ_k)‖x‖_2²  for all x with ‖x‖_0 ≤ k
[Figure: the distance d between two sparse points x_1, x_2 is mapped to a distance within (1 ± δ)d between Ax_1 and Ax_2.]

Randomness comes to the rescue. Random matrices can exhibit the RIP:
- random matrices with iid entries: Gaussian, Rademacher (symmetric Bernoulli), uniform
- structured random matrices: random partial Fourier matrices, random circulant matrices
The RIP holds with high probability if m ≳ k log^γ(n).
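The concentration behind this can be seen empirically (an illustration, not a proof; the constant 4 in the choice of m and the trial count are arbitrary heuristics):

```python
import numpy as np

# Check that an m x n Gaussian matrix with m ~ k log(n) approximately
# preserves the norm of random k-sparse vectors: ||Ax||_2 / ||x||_2 ~ 1.
rng = np.random.default_rng(0)
n, k = 1000, 10
m = int(4 * k * np.log(n))                      # heuristic m ~ k log n
A = rng.standard_normal((m, n)) / np.sqrt(m)    # variance 1/m for near-isometry

ratios = []
for _ in range(200):
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    ratios.append(np.linalg.norm(A @ x) / np.linalg.norm(x))
ratios = np.array(ratios)                       # all close to 1
```

All 200 ratios cluster tightly around 1, consistent with a small restricted isometry constant at this (m, k, n).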

Basis Pursuit (ℓ_1-minimization): argmin_x ‖x‖_1 subject to Ax = y, where ‖x‖_1 = Σ_i |x_i|.
Theorem [Candès]: If A satisfies the δ_2k-RIP with δ_2k < √2 − 1, then BP recovers any k-sparse target exactly.

[Figure: a sparse signal and its exact ℓ_1 reconstruction (computed with CVX) overlaid, value vs. index, indices 0 to 1000.]
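A small demo in the spirit of this experiment, assuming SciPy in place of CVX: Basis Pursuit is recast as a linear program via the standard split x = u − v with u, v ≥ 0, so that ‖x‖_1 = Σ(u + v):

```python
import numpy as np
from scipy.optimize import linprog

# Recover a k-sparse signal from m < n random measurements by solving
#   min ||x||_1  s.t.  Ax = y
# as the LP: min 1^T (u + v)  s.t.  A(u - v) = y,  u, v >= 0.
rng = np.random.default_rng(1)
n, m, k = 200, 80, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

res = linprog(np.ones(2 * n),                # objective: sum(u) + sum(v)
              A_eq=np.hstack([A, -A]),       # equality constraint A(u - v) = y
              b_eq=y, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]                # recombine x = u - v
```

At these sizes (m = 80 measurements for a k = 8 sparse signal in dimension 200), the ℓ_1 solution matches the true signal to solver precision.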

Greedy algorithms: OMP, StOMP, IHT, CoSaMP, SP, … These iteratively estimate the support and the values on the support, and many of them have RIP-based convergence guarantees.
CoSaMP (Needell and Tropp), one iteration:
1. Proxy vector: z = A^T(y − Ax)
2. Identify: Z = supp(z_2k), the 2k largest-magnitude entries of z
3. Merge: T = Z ∪ supp(x)
4. Least squares: b = argmin_x ‖Ax − y‖_2 s.t. x_{T^c} = 0
5. Prune and update: keep the k largest-magnitude entries of b
Converges when δ_4k ≤ 0.1.
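A minimal NumPy sketch of this iteration (problem sizes, seed, and the fixed iteration count are illustrative; a practical implementation would add a stopping rule on the residual):

```python
import numpy as np

def cosamp(A, y, k, iters=30):
    """CoSaMP sketch: proxy -> identify 2k -> merge -> least squares -> prune to k."""
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        z = A.T @ (y - A @ x)                      # proxy of the residual
        omega = np.argsort(np.abs(z))[-2 * k:]     # 2k largest proxy entries
        T = np.union1d(omega, np.flatnonzero(x))   # merge with current support
        sol, *_ = np.linalg.lstsq(A[:, T], y, rcond=None)
        b = np.zeros(n)
        b[T] = sol                                 # least squares on merged support
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-k:]          # prune to the k largest
        x[keep] = b[keep]
    return x

rng = np.random.default_rng(2)
n, m, k = 400, 120, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = cosamp(A, y, k)
```

Once the correct support is identified, the least-squares step makes the recovery exact up to numerical precision.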

Rice single-pixel camera [Wakin et al.]. [Figure: original object; reconstructions with dimensionality reduction at 20% and at 40%.]

Compressive MRI [Lustig et al.]. [Figure: traditional MRI vs. CS MRI reconstructions, compared on acquisition speed.]

Image super-resolution [Yang et al.]. A high-resolution dictionary D_h and a low-resolution dictionary D_l share the same sparse representation α: the low-resolution observation is D_l α, and the CS-based reconstruction is D_h α. [Figure: sparse representation, CS-based reconstruction, original.]

Low-Rank Matrix Recovery

Linear systems w/ low-rank solutions: y_i = ⟨A_i, X⟩, where X = UV^T ∈ R^{n_1×n_2} has rank r, with U ∈ R^{n_1×r} and V ∈ R^{n_2×r}. Simple least squares requires n_1 n_2 measurements, but the number of degrees of freedom is only r(n_1 + n_2 − r). Can we close the gap?

Rank minimization: argmin_X rank(X) subject to A(X) = y. If the target is identifiable, this recovers the low-rank solution exactly. Just like ℓ_0-minimization, it is generally NP-hard, but special measurement operators A admit efficient solvers.

Random measurements. Operators A_i with iid entries (Gaussian, Rademacher, uniform) exhibit a low-rank RIP and are universal. Some structured measurements, such as rank-one A_i = a_i b_i^T and A_i with dependent entries, are instance optimal; for these the standard RIP usually doesn't hold, but other analysis approaches exist.

Nuclear-norm minimization: argmin_X ‖X‖_* subject to A(X) = y, where ‖X‖_* = Σ_i σ_i(X).
Theorem [Recht, Fazel, Parrilo]: If A obeys the δ_5r-RIP with δ_5r < 0.1, then nuclear-norm minimization recovers any rank-r target exactly.

Matrix completion. [Table: users (Alice, Bob, …, Yvon, Zelda) × items, with many ratings missing.]
Theorem [Candès and Recht]: If M is a rank-r matrix from the random orthogonal model, then w.h.p. nuclear-norm minimization recovers M exactly from O(r n^{5/4} log n) uniformly observed entries, where n = max(n_1, n_2).
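A numerical sketch of nuclear-norm-driven completion using iterative singular value soft-thresholding (in the style of SVT/SoftImpute; the threshold 0.1, the 50% sampling rate, and the iteration count are illustrative choices, not from the slides):

```python
import numpy as np

# Complete a rank-r matrix from a random subset of its entries by
# alternating data consistency with soft-thresholding of singular values.
rng = np.random.default_rng(3)
n1, n2, r = 40, 40, 2
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))  # rank-r target
mask = rng.random((n1, n2)) < 0.5                # observe ~50% of the entries

X = np.zeros((n1, n2))
for _ in range(500):
    X[mask] = M[mask]                            # enforce observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = U @ np.diag(np.maximum(s - 0.1, 0)) @ Vt # shrink the spectrum

rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

Soft-thresholding the spectrum is the proximal step of the nuclear norm, which is why this simple loop drives the iterate toward a low-rank completion.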

Robust PCA. Ordinary PCA is sensitive to noise and outliers. Decompose M = L + S into a low-rank component L (e.g., background modeling) and a sparse component S:
argmin_{L,S} ‖L‖_* + λ‖S‖_1 subject to P_Ω_obs(L + S) = Y
[Candès et al.]

Compressive multiplexing [Ahmed and Romberg]. Multiband signals x_m(t) = Σ_{ω=−B}^{B} α_m(ω) e^{j2πωt}, with W = 2B + 1, can be multiplexed onto one channel and recovered from a sampling rate R on the order of (M + W) log³(MW).

Nonlinear CS

Sparsity-constrained minimization. The squared error isn't always the appropriate objective (for example, under non-Gaussian noise models):
argmin_x f(x) subject to ‖x‖_0 ≤ k
This is challenging in its general form, but objectives with certain properties allow approximations: convex relaxation (ℓ_1-regularization) and greedy methods.

Gene classification via sparse logistic regression. Here a_i is a DNA microarray sample, y_i is a binary label (healthy/diseased), the model is p(y | a; x), and x holds sparse weights over the genes:
argmin_x Σ_{i=1}^m log(1 + e^{−y_i a_i^T x}) subject to ‖x‖_0 ≤ k, ‖x‖_2 ≤ 1

Imaging under photon noise [Harmany et al.]. Photon noise follows a Poisson distribution:
p(y | A; x) = Π_i (Ax)_i^{y_i} e^{−(Ax)_i} / y_i!
Physical constraints: A ≥ 0, Ax ≥ 0, x ≥ 0. SPIRAL-TAP solves
argmin_{x ≥ 0} −log p(y | A; x) + τ R(x)
with regularizers such as R(x) = ‖x‖_1, ‖W^T x‖_1, or ‖x‖_TV.

Coherent diffractive imaging [Szameit et al.]. With I the diffracted field intensity, C the shifts of a generating function, L a low-pass filter, and F the Fourier transform, the real image Cx is recovered by minimizing
f(x) = ‖ |LFCx|² − I ‖_2²
[Image: Lima et al., ESRF (www.esrf.eu)]

Gradient Support Pursuit (GraSP), w/ Raj and Boufounos. Generalizes CoSaMP to general objectives f. One iteration:
1. Proxy vector: z = ∇f(x)
2. Identify: Z = supp(z_2k)
3. Merge: T = Z ∪ supp(x)
4. Restricted minimization: b = argmin_x f(x) s.t. x_{T^c} = 0
5. Prune and update: keep the k largest-magnitude entries of b
Converges under the Stable Restricted Hessian (SRH) condition: the Hessian ∇²f(x) is well-conditioned when restricted to sparse subspaces, a generalization of the RIP.
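A toy sketch of this iteration applied to sparse logistic regression (the restricted minimization is approximated by a few plain gradient steps, and all problem sizes, step sizes, and iteration counts are illustrative, not from the talk):

```python
import numpy as np

def logistic_grad(x, A, y):
    """Gradient of the mean logistic loss over samples (A, y)."""
    p = 1.0 / (1.0 + np.exp(-A @ x))
    return A.T @ (p - y) / len(y)

def grasp(A, y, k, iters=20, inner=50, lr=1.0):
    """GraSP sketch: gradient proxy -> identify 2k -> merge -> restricted min -> prune."""
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        z = logistic_grad(x, A, y)                 # gradient as proxy vector
        omega = np.argsort(np.abs(z))[-2 * k:]     # 2k largest gradient entries
        T = np.union1d(omega, np.flatnonzero(x))   # merge with current support
        b = x.copy()
        for _ in range(inner):                     # approximate restricted minimization
            b[T] -= lr * logistic_grad(b, A, y)[T]
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-k:]          # prune to the k largest
        x[keep] = b[keep]
    return x

rng = np.random.default_rng(4)
n, m, k = 100, 600, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-3.0, 3.0], k)
y = (rng.random(m) < 1.0 / (1.0 + np.exp(-A @ x_true))).astype(float)
support_hat = set(np.flatnonzero(grasp(A, y, k)))
```

With enough samples relative to k, the recovered support matches the true set of informative features, which is the quantity of interest in the gene-classification setting.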

There's much more …

Resources:
- a collection of CS papers, tutorials, and software: http://dsp.rice.edu/cs
- sparse and low-rank algorithms wiki: http://ugcs.caltech.edu/~srbecker/wiki/main_page
- a weblog focusing on CS and broader computational areas: http://nuit-blanche.blogspot.com/