ITSchool 11, Austin
Compressed Sensing Using Bernoulli Measurement Matrices
Yuhan Zhou
Advisor: Wei Yu
Department of Electrical and Computer Engineering, University of Toronto, Canada
Motivation
Sparse signal recovery in wireless sensor networks, for applications such as sensor perception, has drawn a great deal of attention. Processors used in wireless sensor networks have limited power and computation capability. In practice, the matrices used for sampling are generated by a pseudorandom number generator with finite precision.
Motivation
Figure: Configuration of CS in wireless sensor networks
What are the information-theoretic limits of lossless sparse recovery using Bernoulli ±1 matrices?
Random Sparse Signals
Definition
X = S · U, where S is a Bernoulli random variable with success probability ρ (0 < ρ < 1/2) and U is an absolutely continuous random variable.
Definition
A pair of signals (X, Y) = (S_X U_X, S_Y U_Y) is said to be jointly sparse if (S_X, S_Y) are i.i.d. generated from p(s_x, s_y) with (s_x, s_y) ∈ {0, 1} × {0, 1}, satisfying:
X and Y are sparse with sparsity rates p(s_x = 1) and p(s_y = 1), respectively;
p(s_x = 1 | s_y = 1) < 1/2 and p(s_y = 1 | s_x = 1) < 1/2.
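As a concrete illustration of these definitions, the following minimal NumPy sketch draws a sparse signal and a jointly sparse pair. The standard-normal choice for U and the example joint support pmf are illustrative assumptions, not part of the definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_signal(n, rho):
    """X_i = S_i * U_i with S_i ~ Bernoulli(rho); U_i is standard normal here
    as one concrete choice of an absolutely continuous distribution."""
    s = (rng.random(n) < rho).astype(float)   # support indicators S
    u = rng.standard_normal(n)                # continuous amplitudes U
    return s * u

def jointly_sparse_pair(n, p_joint):
    """(X, Y) with supports (S_X, S_Y) drawn i.i.d. from the 2x2 pmf p_joint[(sx, sy)]."""
    keys = [(0, 0), (0, 1), (1, 0), (1, 1)]
    probs = [p_joint[k] for k in keys]
    idx = rng.choice(4, size=n, p=probs)
    sx = np.array([keys[i][0] for i in idx], dtype=float)
    sy = np.array([keys[i][1] for i in idx], dtype=float)
    return sx * rng.standard_normal(n), sy * rng.standard_normal(n)

x = sparse_signal(1000, rho=0.05)
x_vec, y_vec = jointly_sparse_pair(
    1000, {(0, 0): 0.85, (0, 1): 0.06, (1, 0): 0.06, (1, 1): 0.03})
```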
Problem Formulation
Definition
A sparse recovery scheme associated with a sparse signal vector X^n ∈ ℝ^n is a pair C_n = (f, g), which consists of an encoder map f : ℝ^n → ℝ^m and a decoder map g : ℝ^m → 𝒳^n. The encoding map f is matrix multiplication, i.e., f(x^n) = A x^n.
Definition
A distributed sparse recovery scheme associated with jointly sparse signal vectors X^n ∈ 𝒳^n and Y^n ∈ 𝒴^n is a triple C_n = (f_1, f_2, g), which consists of two encoder maps and a decoder map:
f_1 : 𝒳^n → F^{m_1}, f_2 : 𝒴^n → G^{m_2}, g : F^{m_1} × G^{m_2} → 𝒳^n × 𝒴^n.
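A minimal sketch of the encoder side of this formulation with a Bernoulli ±1 sampling matrix; the values of n, m, and ρ are hypothetical, and the decoder g is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(1)

def bernoulli_matrix(m, n):
    """m x n measurement matrix with i.i.d. equiprobable +/-1 entries."""
    return rng.choice([-1.0, 1.0], size=(m, n))

n, rho = 1000, 0.05
m = 80                                               # illustrative measurement count
x = (rng.random(n) < rho) * rng.standard_normal(n)   # sparse signal x^n
A = bernoulli_matrix(m, n)
measurements = A @ x                                 # encoder f(x^n) = A x^n
# A decoder g would map `measurements` back to an estimate of x^n, e.g. via
# l1-minimization or a combinatorial search over supports; it is not shown here.
```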
Single Sparse Recovery: Upper Bound
Theorem
Let the random signal vector X^n be sparse with sparsity rate ρ. Then for any small ε > 0 and n large enough, there exists a sparse recovery scheme C_n = (A, g) satisfying P_e(C_n) ≤ ε, provided the sampling rate satisfies
R(C_n) ≥ (1 + o(1)) ( 2 h_2(ρ) + (2/n) log(1/ε) ) + δ_ε,
where δ_ε → 0 as ε → 0.
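A small numerical illustration of this sufficient sampling rate (assuming base-2 logarithms and dropping the o(1) and δ_ε correction terms), showing how the rate translates into a measurement count m:

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def sufficient_rate(rho, eps, n):
    """Sampling rate from the achievability bound above, with the o(1) and
    delta_eps correction terms dropped (an approximation)."""
    return 2 * h2(rho) + (2 / n) * np.log2(1 / eps)

rho, eps, n = 0.05, 1e-3, 10_000
R = sufficient_rate(rho, eps, n)
print(f"R ~ {R:.3f}  =>  m ~ {int(np.ceil(R * n))} measurements for n = {n}")
```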
Single Sparse Recovery: Lower Bound
Theorem
Suppose that the memoryless signal X is sparse with sparsity rate ρ > 0. For any random sparse recovery scheme C_n = (A, g) with sampling rate R_X, if the decoding error probability P_e(C_n) → 0 as n → ∞, then R_X must satisfy
R_X ≥ 2 h_2(ρ) / (log θ(U) + log e),
where θ(U) ≥ 1 depends only on the distribution of U.
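The same kind of numerical sketch for the converse bound; since the slide does not specify θ(U), it is treated here as a free parameter, and θ = 1 is only a placeholder:

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def necessary_rate(rho, theta=1.0):
    """Converse bound 2*h2(rho) / (log2(theta) + log2(e)).  theta(U) >= 1 depends
    on the distribution of U; theta = 1 is a placeholder assumption."""
    return 2 * h2(rho) / (np.log2(theta) + np.log2(np.e))

for rho in (0.01, 0.05, 0.10):
    print(f"rho = {rho:4.2f}: R_X >= {necessary_rate(rho):.3f}")
```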
Distributed Sparse Recovery: Asymptotic Bounds
Theorem
Suppose that the memoryless signals (X, Y) are jointly sparse. Consider a distributed sparse recovery scheme C_n = (A_1, A_2, g) with sampling rates R_X and R_Y. If the average error probability P_e(C_n) → 0 as n → ∞, the sampling rates satisfy
2 H(S_X | S_Y) / log e ≤ R_X ≤ 2 H(S_X | S_Y)
2 H(S_Y | S_X) / log e ≤ R_Y ≤ 2 H(S_Y | S_X)
2 H(S_X, S_Y) / log e ≤ R_X + R_Y ≤ 2 H(S_X, S_Y)
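A sketch evaluating these rate intervals for a hypothetical joint support pmf (base-2 entropies assumed); the pmf below is chosen only so that the marginals and conditionals satisfy the joint-sparsity definition:

```python
import numpy as np

def support_entropies(p):
    """H(S_X|S_Y), H(S_Y|S_X), H(S_X,S_Y) in bits for a 2x2 joint pmf p[s_x, s_y]."""
    p = np.asarray(p, dtype=float)
    Hxy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    px, py = p.sum(axis=1), p.sum(axis=0)            # marginals of S_X and S_Y
    Hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    Hy = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    return Hxy - Hy, Hxy - Hx, Hxy

# Hypothetical joint support pmf: both signals sparse, supports positively correlated.
p = np.array([[0.85, 0.06],
              [0.06, 0.03]])
Hx_y, Hy_x, Hxy = support_entropies(p)
c = np.log2(np.e)
print(f"{2*Hx_y/c:.3f} <= R_X       <= {2*Hx_y:.3f}")
print(f"{2*Hy_x/c:.3f} <= R_Y       <= {2*Hy_x:.3f}")
print(f"{2*Hxy/c:.3f} <= R_X + R_Y <= {2*Hxy:.3f}")
```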
Comparison with Gaussian Sampling Matrices
Figure: Sampling-rate comparison, Gaussian vs. Bernoulli, for ρ = (log n)/n
Conclusions
Information-theoretic limits on compressed sensing using Bernoulli sign matrices.
Deep insights gained for designing practical sparse recovery schemes in wireless sensor networks.