Model-Based Compressive Sensing for Signal Ensembles. Marco F. Duarte Volkan Cevher Richard G. Baraniuk

Concise Signal Structure
Sparse signal: only K out of N coordinates are nonzero. Model: union of K-dimensional subspaces aligned with the coordinate axes. [Figure: signal x(n) and its sorted coefficients]

Compressive Sensing
Replace samples by a more general encoder based on a few linear projections (inner products): y = Φx, where x is sparse. Recover x from y using optimization (ℓ1-norm minimization via LPs or QPs) or greedy algorithms (OMP, CoSaMP, SP, etc.). [Figure: measurements y and sparse signal x; the number of measurements tracks the information rate, not the ambient rate]
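As a minimal sketch of the measure-then-recover pipeline on this slide, the snippet below takes random Gaussian measurements of a synthetic sparse signal and recovers it with OMP. All sizes and the random seed are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Illustrative toy sizes (assumptions): N-dim signal, K nonzeros, M measurements.
rng = np.random.default_rng(0)
N, K, M = 256, 5, 64

# K-sparse signal x and a random Gaussian measurement matrix Phi.
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x  # compressive measurements

def omp(Phi, y, K):
    """Orthogonal Matching Pursuit: greedily build a K-atom support."""
    residual, sup = y.copy(), []
    for _ in range(K):
        # Pick the column most correlated with the current residual.
        sup.append(int(np.argmax(np.abs(Phi.T @ residual))))
        # Least-squares fit on the current support, then update the residual.
        coef = np.linalg.lstsq(Phi[:, sup], y, rcond=None)[0]
        residual = y - Phi[:, sup] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[sup] = coef
    return x_hat

x_hat = omp(Phi, y, K)
print(np.linalg.norm(x_hat - x))
```

With these dimensions a Gaussian matrix satisfies the conditions for exact greedy recovery with overwhelming probability, so the reconstruction error is at numerical precision.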

Restricted Isometry Property (RIP)
Preserve the structure of sparse signals. RIP of order 2K implies that, for all K-sparse x_1 and x_2,
(1 − δ_{2K}) ||x_1 − x_2||_2^2 ≤ ||Φ(x_1 − x_2)||_2^2 ≤ (1 + δ_{2K}) ||x_1 − x_2||_2^2.
[Figure: K-planes] [Candès and Tao]

Restricted Isometry Property (RIP)
Preserve the structure of sparse signals. A random (i.i.d. Gaussian or Bernoulli) matrix has the RIP with high probability if M = O(K log(N/K)). [Figure: K-planes] [Candès and Tao; Baraniuk, Davenport, DeVore and Wakin]
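An empirical sanity check of the slide's claim (a sketch, not a proof): with M on the order of K·log(N/K) rows, a normalized Gaussian matrix approximately preserves the norm of random K-sparse vectors. The constant 4 and the sizes below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 512, 8
M = int(4 * K * np.log(N / K))                   # rule-of-thumb measurement count
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # normalized so E||Phi x||^2 = ||x||^2

# Ratio ||Phi x|| / ||x|| over many random K-sparse test vectors.
ratios = []
for _ in range(200):
    x = np.zeros(N)
    x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(Phi @ x) / np.linalg.norm(x))

print(min(ratios), max(ratios))  # concentrates near 1 for these sizes
```

Note this only probes random sparse vectors; the RIP itself is a uniform statement over all K-sparse signals, which is why the log(N/K) factor is needed.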

Sensor Networks
Networks of many sensor nodes, each with a sensor, a microprocessor for computation, wireless communication and networking, and a battery. The sensors observe a single event and acquire correlated signals. Nodes must be energy efficient: minimize communication at the expense of off-site computation, which motivates distributed compression.

Distributed Compressive Sensing (DCS)
Distributed sensing: each sensor measures its own signal independently,
y_1 = Φ_1 x_1, y_2 = Φ_2 x_2, ..., y_J = Φ_J x_J.
[Baron, Duarte, Wakin, Sarvotham, Baraniuk]

Distributed Compressive Sensing (DCS)
Distributed sensing of the measurements Y, followed by joint recovery of the signal ensemble X. [Baron, Duarte, Wakin, Sarvotham, Baraniuk]

JSM-2: Common Sparse Supports Model
Measure J signals, each K-sparse. The signals share their sparse components (common support) but with different coefficients. Recovery using the Simultaneous Orthogonal Matching Pursuit (SOMP) algorithm [Tropp, Gilbert, Strauss]. [Figure: signals x_1, x_2, x_3, ..., x_{J-1}, x_J with aligned supports]
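A minimal SOMP sketch for the JSM-2 model described above: J signals share one K-sparse support but have different coefficients, and atoms are selected by correlation aggregated across the whole ensemble. For simplicity this sketch assumes a single shared measurement matrix; sizes and seed are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, M, J = 128, 4, 32, 8

# JSM-2 ensemble: one common support, per-signal coefficients.
support = rng.choice(N, K, replace=False)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # shared matrix (simplifying assumption)
X = np.zeros((N, J))
X[support] = rng.standard_normal((K, J))
Y = Phi @ X                                      # column j holds sensor j's measurements

def somp(Phi, Y, K):
    """Simultaneous OMP: pick atoms by summed correlation across all signals."""
    residual, sup = Y.copy(), []
    for _ in range(K):
        # Aggregate evidence for each atom over the whole ensemble.
        scores = np.sum(np.abs(Phi.T @ residual), axis=1)
        sup.append(int(np.argmax(scores)))
        coef = np.linalg.lstsq(Phi[:, sup], Y, rcond=None)[0]
        residual = Y - Phi[:, sup] @ coef
    X_hat = np.zeros((Phi.shape[1], Y.shape[1]))
    X_hat[sup] = coef
    return X_hat

X_hat = somp(Phi, Y, K)
print(np.linalg.norm(X_hat - X))
```

Summing the correlation scores across signals is what exploits the shared support: evidence for a true atom accumulates over all J sensors, so fewer measurements per sensor suffice.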

Beyond Sparse Models
The sparse signal model captures only simplistic primary structure. For many signal types, the locations of the nonzero coefficients in the sparse representation provide additional structure: wavelets for natural images, pixels for background-subtracted images.

Sparse Signals
Defn: A K-sparse signal lives on the collection of K-dimensional subspaces aligned with the coordinate axes.

Structured Sparse Signals
Defn: A K-structured sparse signal lives on a particular (reduced) collection of K-dimensional canonical subspaces. [Lu and Do] [Blumensath and Davies]

Sparse Signal Ensemble
Defn: An ensemble of J K-sparse signals lives on a collection of JK-dimensional subspaces aligned with the coordinate axes.

Structured Sparse Signal Ensemble
Defn: A structured ensemble of J K-sparse signals with common sparse support lives on a particular (reduced) collection of JK-dimensional canonical subspaces.

RIP for Structured Sparsity Model
Preserve the structure only of sparse signals that follow the model. A random (i.i.d. Gaussian or Bernoulli) matrix has the JSM-2 model RIP with high probability if the number of measurements scales with the subspace dimension plus the log of the number of subspaces, M = O(JK + K log(N/K)) [Blumensath and Davies]. [Figure: signals X_1, X_2 on JK-dimensional planes]

RIP for Common Sparse Support Model
A random (i.i.d. Gaussian or Bernoulli) matrix has the model-based RIP with high probability under the same measurement scaling. In distributed settings, measurements from different sensors can be added together to effectively obtain a dense measurement matrix. [Figure: signals X_1, X_2 on JK-dimensional planes]

Standard CS Recovery
CoSaMP [Needell and Tropp] iterates:
1. Calculate the current residual
2. Form the residual signal estimate (proxy)
3. Calculate an enlarged support estimate
4. Estimate the signal on the enlarged support
5. Shrink the support
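The five steps above can be sketched directly in code. This is a minimal CoSaMP implementation on a synthetic problem; the sizes, seed, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, K, M = 256, 6, 100

# Synthetic K-sparse signal and Gaussian measurements.
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

def cosamp(Phi, y, K, iters=30):
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x_hat                      # 1. current residual
        e = Phi.T @ r                            # 2. residual signal proxy
        # 3. enlarged support: current support plus 2K largest proxy entries
        omega = np.union1d(np.flatnonzero(x_hat),
                           np.argsort(np.abs(e))[-2 * K:])
        # 4. least-squares signal estimate on the enlarged support
        b = np.zeros_like(x_hat)
        b[omega] = np.linalg.lstsq(Phi[:, omega], y, rcond=None)[0]
        # 5. shrink support: keep the K largest entries
        keep = np.argsort(np.abs(b))[-K:]
        x_hat = np.zeros_like(b)
        x_hat[keep] = b[keep]
    return x_hat

x_rec = cosamp(Phi, y, K)
print(np.linalg.norm(x_rec - x))
```

Model-based variants change only step 3 and step 5, replacing "largest entries" with a structured sparse approximation.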

Model-Based CS Recovery
Model-based CoSaMP replaces the best K-term approximation with a K-term structured sparse approximation algorithm:
1. Calculate the current residual
2. Form the residual signal estimate
3. Calculate an enlarged support estimate
4. Estimate the signal on the enlarged support
5. Shrink the support
[Baraniuk, Cevher, Duarte, Hegde 2008]

Model-Based Recovery for JSM-2
Model-based distributed CoSaMP (CoSOMP) iterates:
1. Calculate the current residual at each sensor: r_j = y_j − Φ_j x̂_j
2. Form the residual signal estimate at each sensor: e_j = Φ_j^T r_j
3. Merge the sensor estimates (elementwise product): e = Σ_{j=1}^J e_j ⊙ e_j
4. Calculate the enlarged support estimate: Ω = supp(x̂) ∪ supp(T(e, 2K))
5. Estimate the signal at each sensor on the enlarged support (least squares): b_j|_Ω = Φ_{j,Ω}^† y_j, b_j|_{Ω^C} = 0
6. Merge the sensor estimates: b = Σ_{j=1}^J b_j ⊙ b_j
7. Shrink the support: Λ = supp(T(b, K))
8. Update the signal estimate at each sensor: x̂_j|_Λ = b_j|_Λ, x̂_j|_{Λ^C} = 0
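A sketch of this CoSOMP iteration on a synthetic JSM-2 ensemble: per-sensor proxies are merged by summing their elementwise squares, one common support is kept, and per-sensor coefficients are refit by least squares. All sizes, the random matrices, and the iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N, K, M, J = 128, 4, 40, 6

# JSM-2 ensemble: common support, per-sensor coefficients and matrices Phi_j.
support = rng.choice(N, K, replace=False)
Phis = [rng.standard_normal((M, N)) / np.sqrt(M) for _ in range(J)]
X = np.zeros((N, J))
X[support] = rng.standard_normal((K, J))
Ys = [Phis[j] @ X[:, j] for j in range(J)]       # y_j = Phi_j x_j

X_hat = np.zeros((N, J))
for _ in range(20):
    # Steps 1-3: r_j = y_j - Phi_j x_j ; e_j = Phi_j^T r_j ; e = sum_j e_j .* e_j
    E = [Phis[j].T @ (Ys[j] - Phis[j] @ X_hat[:, j]) for j in range(J)]
    e = np.sum(np.square(E), axis=0)
    # Step 4: Omega = supp(X_hat) union supp(T(e, 2K))
    omega = np.union1d(np.flatnonzero(np.any(X_hat, axis=1)),
                       np.argsort(e)[-2 * K:])
    # Steps 5-6: per-sensor least squares on Omega, merge b = sum_j b_j .* b_j
    B = np.zeros((N, J))
    for j in range(J):
        B[omega, j] = np.linalg.lstsq(Phis[j][:, omega], Ys[j], rcond=None)[0]
    b = np.sum(np.square(B), axis=1)
    # Steps 7-8: shrink to a common K-term support, update each sensor's estimate
    lam = np.argsort(b)[-K:]
    X_hat = np.zeros((N, J))
    X_hat[lam] = B[lam]

print(np.linalg.norm(X_hat - X))
```

Because the support decision pools energy across all J sensors, the per-sensor measurement count M can be much smaller than what independent CoSaMP would need.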

Model-Based CS Recovery Guarantees
Theorem: Assume we obtain noisy CS measurements y = ΦX + n of a signal ensemble X. If Φ has the model-based RIP, then the recovery error obeys
||X − X̂||_2 ≤ C_1 · σ_K(X) + C_2 · ||n||_2,
where σ_K(X) is the K-term structured sparse approximation error of the signal and n is the noise. In words: instance optimality based on structured sparse approximation.

Model-Based CS Recovery Guarantees
Theorem: Assume we obtain noisy CS measurements y_j = Φ_j x_j + n_j of a signal ensemble. If each Φ_j has the RIP, then the recovery error obeys
||X − X̂||_2 ≤ C_1 · σ_K(X) + C_2 · ||n||_2,
where σ_K(X) is the K-term structured sparse approximation error of the signal and n is the noise. In words: instance optimality based on structured sparse approximation.

Real Data Example: Environmental Sensing in the Intel Berkeley Lab
J = 49 sensors, N = 1024 samples each. Compare: independent recovery (CoSaMP), existing joint recovery (SOMP), and model-based joint recovery (CoSOMP).

Experimental Results: Brightness Data
[Figure: (a) original signal; (c) CoSaMP recovery, distortion = 15.1733 dB; (d) CoSOMP recovery, distortion = 16.3197 dB. N = 1024, J = 48, M = 400]

Experimental Results: Humidity Data
[Figure: average recovery distortion (dB) versus number of measurements M (0 to 500) for CoSaMP, SOMP, and CoSOMP. N = 1024, J = 48]

Experimental Results: Network Size
[Figure: probability of exact recovery versus number of measurements M (0 to 25) for CoSaMP and CoSOMP with network sizes J = 1, 2, 4, 8, 16, 32. N = 1024]

Conclusions
An intuitive union-of-subspaces model encodes the structure of jointly sparse signal ensembles. Structure enables a reduction in the number of measurements required for recovery. Signal recovery algorithms are easily adapted to leverage the additional structure. New structure-based recovery guarantees, such as instance optimality.
dsp.rice.edu/cs