Compressed Sensing Using Bernoulli Measurement Matrices


ITSchool '11, Austin

Compressed Sensing Using Bernoulli Measurement Matrices

Yuhan Zhou (Advisor: Wei Yu)
Department of Electrical and Computer Engineering, University of Toronto, Canada

Motivation

Sparse signal recovery in wireless sensor networks, for applications such as sensor perception, has drawn a great deal of attention. The processors used in wireless sensor networks have limited power and computation capability. In practice, the matrices used for sampling are generated by a pseudorandom number generator with finite precision.

Motivation: Configuration of CS in Wireless Sensor Networks

What are the information-theoretic limits of lossless sparse recovery using Bernoulli ±1 matrices? (A sketch of the sampling step appears below.)
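As a concrete illustration of the sampling setup, the sketch below draws a ±1 Bernoulli measurement matrix from a seeded pseudorandom generator, as the motivation slide describes, and applies it to a signal. The dimensions and seed are arbitrary choices for the example, not values from the talk.

```python
import numpy as np

# Illustrative dimensions and seed -- not values from the talk.
n, m = 256, 64
rng = np.random.default_rng(seed=0)   # finite-precision pseudorandom generator

# Bernoulli sign matrix: each entry is +1 or -1 with probability 1/2.
A = rng.choice([-1.0, 1.0], size=(m, n))

x = rng.standard_normal(n)            # any length-n signal
y = A @ x                             # m linear measurements y = Ax
print(A.shape, y.shape)               # (64, 256) (64,)
```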

Random Sparse Signals

Definition. $X = S \cdot U$, where $S$ is a Bernoulli random variable with success probability $\rho$ ($0 < \rho < 1/2$) and $U$ is an absolutely continuous random variable.

Definition. A pair of signals $(X, Y) = (S_X U_X, S_Y U_Y)$ is said to be jointly sparse if $(S_X, S_Y)$ are i.i.d. generated from $p(s_x, s_y)$ with $(s_x, s_y) \in \{0,1\} \times \{0,1\}$, satisfying:
- $X$ and $Y$ are sparse with sparsity rates $p(s_x = 1)$ and $p(s_y = 1)$, respectively;
- $p(s_x = 1 \mid s_y = 1) < 1/2$ and $p(s_y = 1 \mid s_x = 1) < 1/2$.
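A minimal sketch of this signal model, assuming $U$ is standard Gaussian (any absolutely continuous distribution would do) and a sparsity rate chosen purely for illustration:

```python
import numpy as np

def sparse_signal(n, rho, rng):
    """Draw X = S * U entrywise: S ~ Bernoulli(rho), U absolutely continuous."""
    s = rng.random(n) < rho          # support pattern, P(S = 1) = rho
    u = rng.standard_normal(n)       # U ~ N(0, 1): one admissible choice
    return s * u

rng = np.random.default_rng(seed=1)
x = sparse_signal(n=256, rho=0.1, rng=rng)
print(f"nonzeros: {np.count_nonzero(x)} of {x.size}")
```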

Problem Formulation

Definition. A sparse recovery scheme associated with a sparse signal vector $X^n \in \mathbb{R}^n$ is a pair $C_n = (f, g)$ consisting of an encoder map $f : \mathbb{R}^n \to \mathbb{R}^m$ and a decoder map $g : \mathbb{R}^m \to \mathcal{X}^n$, where the encoding map $f$ is matrix multiplication, i.e., $f(x^n) = A x^n$.

Definition. A distributed sparse recovery scheme associated with jointly sparse signal vectors $X^n \in \mathcal{X}^n$ and $Y^n \in \mathcal{Y}^n$ is a triple $C_n = (f_1, f_2, g)$ consisting of two encoder maps $f_1 : \mathcal{X}^n \to \mathcal{F}^{m_1}$ and $f_2 : \mathcal{Y}^n \to \mathcal{G}^{m_2}$, and a decoder map $g : \mathcal{F}^{m_1} \times \mathcal{G}^{m_2} \to \mathcal{X}^n \times \mathcal{Y}^n$.
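The slides leave the decoder $g$ abstract; a standard stand-in is basis-pursuit (L1-minimization) decoding, shown below as a linear program. This is an illustrative sketch of one possible $(f, g)$ pair, not the scheme analyzed in the talk, and the dimensions are arbitrary.

```python
import numpy as np
from scipy.optimize import linprog

def encode(A, x):
    """Encoder f(x) = Ax: m real measurements of the length-n signal."""
    return A @ x

def decode_l1(A, y):
    """Basis pursuit: min ||x||_1 s.t. Ax = y, as an LP over variables (x, t)."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize sum of t_i
    # Constraints x_i - t_i <= 0 and -x_i - t_i <= 0 enforce |x_i| <= t_i.
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # Ax = y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=(None, None))
    return res.x[:n]

rng = np.random.default_rng(seed=2)
n, m, rho = 128, 60, 0.05
A = rng.choice([-1.0, 1.0], size=(m, n))
x = (rng.random(n) < rho) * rng.standard_normal(n)
x_hat = decode_l1(A, encode(A, x))
print("max reconstruction error:", np.max(np.abs(x_hat - x)))
```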

Single Sparse Recovery: Upper Bound

Theorem. Let the random signal vector $X^n$ be sparse with sparsity rate $\rho$. Then for any small $\epsilon > 0$ and $n$ large enough, there exists a sparse recovery scheme $C_n = (A, g)$ satisfying $P_e(C_n) \le \epsilon$, provided the sampling rate satisfies
$$R(C_n) \ge \big(1 - o(1)\big)\left( 2 h_2(\rho) + \frac{2 \log(1/\epsilon)}{n} \right) + \delta_\epsilon,$$
where $\delta_\epsilon \to 0$ as $\epsilon \to 0$.
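To get a feel for the dominant $2 h_2(\rho)$ term in this bound, the snippet below evaluates the binary entropy $h_2(\rho)$ (in bits) for a few sparsity rates; the chosen values of $\rho$ are illustrative only.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for rho in (0.01, 0.05, 0.1):
    # Leading term of the achievable sampling rate, ignoring the
    # vanishing (2/n) log(1/eps) and delta_eps corrections.
    print(f"rho = {rho:5}:  2*h2(rho) = {2 * h2(rho):.3f} bits")
```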

Single Sparse Recovery: Lower Bound

Theorem. Suppose that the memoryless signal $X$ is sparse with sparsity rate $\rho > 0$. For any random sparse recovery scheme $C_n = (A, g)$ with sampling rate $R_X$, if the decoding error probability $P_e(C_n) \to 0$ as $n \to \infty$, then $R_X$ must satisfy
$$R_X \ge \frac{2 h_2(\rho)}{\log \theta(U) + \log e},$$
where $\theta(U) \ge 1$ depends only on the distribution of $U$.

Distributed Sparse Recovery: Asymptotic Bounds

Theorem. Suppose that the memoryless signals $(X, Y)$ are jointly sparse. Consider a distributed sparse recovery scheme $C_n = (A_1, A_2, g)$ with sampling rates $R_X$ and $R_Y$. If the average error probability $P_e(C_n) \to 0$ as $n \to \infty$, the sampling rates satisfy
$$\frac{2 H(S_X \mid S_Y)}{\log e} \le R_X \le 2 H(S_X \mid S_Y),$$
$$\frac{2 H(S_Y \mid S_X)}{\log e} \le R_Y \le 2 H(S_Y \mid S_X),$$
$$\frac{2 H(S_X, S_Y)}{\log e} \le R_X + R_Y \le 2 H(S_X, S_Y).$$
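The corner points of this Slepian-Wolf-like region follow from the joint support distribution via the chain rule. A sketch, assuming an illustrative joint pmf $p(s_x, s_y)$ (not a value from the talk) that satisfies the jointly-sparse conditions of the earlier definition:

```python
import math

def H(probs):
    """Shannon entropy in bits of a distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint pmf p(s_x, s_y) over {0,1}^2: sparsity rates are 0.10
# and p(s_x=1 | s_y=1) = p(s_y=1 | s_x=1) = 0.2 < 1/2.
p = {(0, 0): 0.82, (0, 1): 0.08, (1, 0): 0.08, (1, 1): 0.02}

H_joint = H(p.values())
H_x = H([p[0, 0] + p[0, 1], p[1, 0] + p[1, 1]])   # marginal H(S_X)
H_y = H([p[0, 0] + p[1, 0], p[0, 1] + p[1, 1]])   # marginal H(S_Y)

# Conditional entropies via the chain rule: H(A|B) = H(A,B) - H(B).
H_x_given_y = H_joint - H_y
H_y_given_x = H_joint - H_x

log_e = math.log2(math.e)
print(f"R_X       >= {2 * H_x_given_y / log_e:.3f}   (2H(S_X|S_Y)/log e)")
print(f"R_Y       >= {2 * H_y_given_x / log_e:.3f}   (2H(S_Y|S_X)/log e)")
print(f"R_X + R_Y >= {2 * H_joint / log_e:.3f}   (2H(S_X,S_Y)/log e)")
```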

Comparison with Gaussian Sampling Matrices

[Figure: sampling-rate comparison, Gaussian vs. Bernoulli, for $\rho = (\log n)/n$.]

Conclusions and References

Conclusions:
- Information-theoretic limits on compressed sensing using Bernoulli sign matrices.
- Insights gained for designing practical sparse recovery schemes in wireless sensor networks.

References:
- J. Bourgain, V. H. Vu, and P. M. Wood, "On the singularity probability of discrete random matrices," Journal of Functional Analysis, vol. 258, no. 2, pp. 559-603, Jan. 2010.
- G. Reeves and M. Gastpar, "'Compressed' compressed sensing," in Proc. IEEE Int. Symp. Information Theory (ISIT), Austin, TX, USA, June 2010.
- D. L. Donoho, "Compressed sensing," IEEE Transactions on Information Theory, vol. 52, no. 4, pp. 1289-1306, Apr. 2006.