Support recovery in compressed sensing: An estimation theoretic approach


Support recovery in compressed sensing: An estimation theoretic approach

Amin Karbasi, Ali Hormati, Soheil Mohajer, Martin Vetterli
School of Computer and Communication Sciences
École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland

Abstract: Compressed sensing (CS) deals with the reconstruction of sparse signals from a small number of linear measurements. One of the main challenges in CS is to find the support of a sparse signal from a set of noisy observations. In the CS literature, several information-theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived, where the focus is mainly on random measurement matrices. In this paper, we investigate the support recovery problem from an estimation theory point of view, where no specific assumption is made on the underlying measurement matrix. By using the Hammersley-Chapman-Robbins (HCR) bound, we derive a fundamental lower bound on the performance of any unbiased estimator, which provides necessary conditions for reliable ℓ2-norm support recovery. We then analyze the optimal decoder to provide conditions under which the HCR bound is achievable. This leads to a set of sufficient conditions for reliable ℓ2-norm support recovery.

I. INTRODUCTION

Linear sampling of sparse signals, with a number of samples close to their sparsity level, has recently received great attention under the name of Compressed Sensing or Compressive Sampling (CS) [1], [2]. A k-sparse signal θ ∈ ℝ^p is defined as a signal with k ≪ p nonzero expansion coefficients in some orthonormal basis or frame. The goal of compressed sensing is to find measurement matrices Φ ∈ ℝ^{m×p}, followed by reconstruction algorithms, which allow robust recovery of sparse signals using the least number of measurements and low computational complexity. In practice, however, all the measurements are noisy, and thus the exact recovery of θ is impossible. Support recovery refers to the problem of correctly estimating the positions of the non-zero entries based on a set of noisy observations. A large body of recent work (e.g., [3], [4], [5], [6]) has established information-theoretic limits for exact support recovery based on a {0, 1}-valued loss function. This work mainly focuses on the standard Gaussian measurement ensemble, where the elements of the measurement matrix are drawn i.i.d. from the Gaussian distribution N(0, 1).

In this paper, we look at the support recovery problem from an estimation theory point of view, where the error metric between the true and the estimated support is the ℓ2-norm. The positions of the nonzero entries of θ form a set of integers between 1 and p. Consequently, support recovery in a discrete setup can be regarded as estimating restricted parameters. This leads us to use the Hammersley-Chapman-Robbins (HCR) bound, which provides a lower bound on the variance of any unbiased estimator of a set of restricted parameters [7], [8].

The organization of this paper is as follows. In Section II, we provide a more precise formulation of the problem. We derive the HCR bound for the support recovery problem in Section III, where no assumption is made on the measurement matrix. We then apply the obtained bound to random measurement matrices in order to determine a lower bound on the number of measurements for reliable ℓ2-norm support recovery. Of equal interest are the conditions under which the derived HCR bound is achievable. To this end, in Section IV, we study the performance of the Maximum-Likelihood (ML) decoder and derive conditions under which it becomes unbiased and achieves the HCR bound. Again, no assumption is made on the measurement matrix.
Using the Gaussian measurement ensemble as an example, we can then identify the sufficient number of measurements for reliable ℓ2-norm support recovery.

II. PROBLEM STATEMENT

In this paper, we consider a deterministic signal model in which θ ∈ ℝ^p is a fixed but unknown vector with exactly k non-zero entries. We refer to k as the signal sparsity, p as the signal dimension, and define the support vector as the positions of the non-zero elements of θ. More precisely,

s(θ) = (n_1, n_2, ..., n_k),    (1)

where the corresponding non-zero entries of θ are

θ_s = (θ_{n_1}, θ_{n_2}, ..., θ_{n_k}).    (2)

We assume that n_1 < n_2 < ... < n_k. Suppose we are given a vector of noisy observations y ∈ ℝ^m of the form

y = Φθ + ε,    (3)

where Φ ∈ ℝ^{m×p} is the measurement matrix, and ε ∼ N(0, σ²I_m) is additive Gaussian noise. Throughout this paper, we assume w.l.o.g. that σ² is fixed, since any scaling of σ can be accounted for in the scaling of θ. Let x = Φθ, and let Φ_s denote the subspace spanned by the columns of Φ at the positions indexed by s(θ). Since there are N = (p choose k) subspaces of dimension k, a number from 1 to N can be assigned to them, and w.l.o.g. we assume that x belongs to the first subspace, s_1 = s. Due to the presence of noise, θ cannot be recovered exactly. However, a sparse-recovery algorithm outputs an estimate θ̂. In the support recovery problem, we are only interested in estimating the support.
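
As a concrete illustration of the observation model (1)-(3), the following Python sketch (hypothetical helper name; a minimal simulation, not code from the paper) draws a k-sparse θ whose non-zero magnitudes are bounded away from zero, anticipating the signal class defined below, together with a Gaussian Φ and noisy observations y = Φθ + ε:

    import numpy as np

    def sample_instance(p=64, k=4, m=32, sigma=0.5, theta_min=1.0, rng=None):
        """Draw a k-sparse theta in R^p, an i.i.d. N(0,1) Phi, and y = Phi @ theta + noise."""
        rng = np.random.default_rng(rng)
        support = np.sort(rng.choice(p, size=k, replace=False))  # s(theta), as in (1)
        theta = np.zeros(p)
        # non-zero magnitudes at least theta_min, with random signs
        theta[support] = rng.choice([-1.0, 1.0], size=k) * (theta_min + rng.random(k))
        Phi = rng.standard_normal((m, p))                  # Gaussian measurement ensemble
        y = Phi @ theta + sigma * rng.standard_normal(m)   # noisy observations, eq. (3)
        return theta, support, Phi, y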

To that end, we can consider different performance metrics for the estimate. In [6], the measure of error between the estimate and the true signal is a {0, 1}-valued loss function:

ρ_1(θ̂, θ) = I[s(θ̂) ≠ s(θ)],    (4)

where I[·] is the indicator function. This metric is appropriate for exact support recovery. In this work, we are interested in approximate support recovery. For this purpose, we consider the following ℓ2-norm error metric:

ρ_2(θ̂, θ) = ‖s(θ̂) − s(θ)‖₂².    (5)

Note that ρ_2(θ̂, θ) = 0 implies ρ_1(θ̂, θ) = 0 and vice versa. As was mentioned in [6], the SNR is not suitable for the support recovery problem: it is possible to generate problem instances for which support recovery is arbitrarily difficult, in particular by sending the smallest coefficient to zero (assuming that k > 1) at an arbitrarily rapid rate, even as the SNR becomes arbitrarily large (by increasing the rest). Hence, we also define

θ_min = min_{i ∈ s} |θ_i|.    (6)

In particular, our results apply to any unbiased decoder that operates over the signal class

C(θ_min) = {θ ∈ ℝ^p : |θ_i| ≥ θ_min, ∀ i ∈ s}.    (7)

With this setup, our goal is to find conditions, in terms of the parameters (m, p, k, θ_min), under which the variance of the error of any unbiased estimator goes to zero, for any signal picked from the signal class C(θ_min), as the signal dimension increases. Our analysis is high-dimensional in nature, in the sense that the signal dimension p goes to infinity. More precisely, we say that ℓ2-norm support recovery is reliable if

lim_{p→∞} E[ρ_2(θ̂, θ)] = 0    (8)

for any θ ∈ C(θ_min), under some scaling of (θ_min, k, m) as a function of p. For unbiased estimators, (8) is equivalent to

lim_{p→∞} tr[Cov(ŝ(θ))] = 0,    (9)

where ŝ(θ) is the estimated support of θ. Since the support estimation is based on y, with abuse of notation we also denote it by ŝ(y). Throughout this paper, we only consider unbiased estimators.

III. HAMMERSLEY-CHAPMAN-ROBBINS BOUND

The Cramér-Rao (CR) bound is a well-known tool in statistics which provides a lower bound on the variance of the error of any unbiased estimator of an unknown deterministic parameter δ from a set of measurements y [9]. More specifically, in a single-parameter scenario, the estimated value δ̂ satisfies

var(δ̂) ≥ [ ∫ (∂ ln P(y; δ)/∂δ)² P(y; δ) dy ]⁻¹,    (10)

where P(y; δ) is the pdf of the measurements, which depends on the parameter δ. As (10) suggests, the CR bound is typically derived for estimating a continuous parameter. In many cases, there is a priori information on the estimated parameter which restricts it to take values from a predetermined set. An example is the estimation of the mean of a normal distribution when one knows that the true mean is an integer. In such scenarios, the Hammersley-Chapman-Robbins (HCR) bound provides a stronger lower bound on the variance of any unbiased estimator [7], [8]. More specifically, let us assume that the set of independent observations y = (y_1, y_2, ..., y_m) is drawn according to a probability distribution with density function P(y; δ), where δ is a parameter belonging to some parameter set Δ (e.g., the set of integer numbers) and completely characterizes the pdf. In addition, the sequence δ is partitioned into two subsequences δ = (δ_1, δ_2), where we are only interested in estimating the parameters included in the subsequence δ_1. Let δ̂_1(y) denote an unbiased estimator of δ_1. The HCR bound on the trace of the covariance matrix of any unbiased estimator of δ_1 is given by

tr[Cov(δ̂_1)] ≥ sup_{δ′ ∈ Δ} ‖δ′_1 − δ_1‖₂² / ( ∫ P²(y; δ′)/P(y; δ) dy − 1 ),    (11)

in which δ′ = (δ′_1, δ_2). The set Δ is chosen so that δ′ takes values according to the a priori information.
Example 3.1: For clarity, let us consider the performance of an unbiased estimator of the mean of a normal distribution based on m independent samples, i.e., y = (y_1, y_2, ..., y_m). In this case δ = (µ, σ), δ_1 = µ, δ_2 = σ, and

P(y; δ) = (2π)^{−m/2} σ^{−m} exp( −(1/(2σ²)) Σ_{i=1}^m (y_i − µ)² ).    (12)

Let µ̂(y) denote an unbiased estimator of µ, the parameter we want to estimate. When there is no prior information on µ, it follows from the CR bound that

var(µ̂) ≥ σ²/m.    (13)

Once the mean is restricted to be an integer, we may write δ_1 = µ and δ′_1 = µ + α, where α is a non-zero integer. Then upon integration we get

var(µ̂) ≥ max_{α ≠ 0} α² / ( e^{mα²/σ²} − 1 )    (14)
        = 1 / ( e^{m/σ²} − 1 ),    (15)

where the maximum is attained for α = ±1. A point worth mentioning is the role of prior information: while (13) drops linearly, (15) decreases exponentially with respect to the number of observations. It is also interesting to note that (14) applies as well to the case in which the parameter is not restricted. We then have to deal with the maximization in (14) for variations in α, where α may take any value (not necessarily integral) except α = 0. Since the RHS of (14) is a decreasing function of |α|, we let α → 0 and we deduce (13).

In the support recovery problem, we know a priori that each entry of the support vector takes values from the restricted set {1, 2, ..., p}. Hence the HCR bound can provide us with a lower bound on the performance of any unbiased estimator.
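
Before stating the bound for support recovery, here is a quick numeric check of Example 3.1 (a minimal sketch under the stated Gaussian model, not from the paper), comparing the CR bound (13) against the integer-restricted HCR bound (15):

    import numpy as np

    sigma2 = 1.0
    for m in (1, 2, 5, 10, 20):
        cr = sigma2 / m                    # CR bound (13): linear decay in m
        hcr = 1.0 / np.expm1(m / sigma2)   # HCR bound (15): exponential decay in m
        print(f"m={m:2d}   CR={cr:.3e}   HCR={hcr:.3e}")

Already for m = 10 the restricted-parameter bound is more than three orders of magnitude below the CR bound, which is the gain delivered by the integer prior.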

Theorem 3.2: Assume ŝ(y) to be an unbiased estimator of the support s. The HCR lower bound on the variance of ŝ(y) is given by

tr[Cov(ŝ)] ≥ max_{i ∈ {2, ..., N}} ‖s − s_i‖₂² / ( e^{‖x − p_{s_i}(x)‖²/σ²} − 1 ),    (16)

in which p_{s_i}(x) denotes the projection of x onto the subspace spanned by Φ_{s_i}.

Proof: Since our observations are of the form y = Φθ + ε, the set of unknown parameters δ consists of the support vector s(θ) = (n_1, n_2, ..., n_k) and the corresponding coefficients θ_s = (θ_{n_1}, θ_{n_2}, ..., θ_{n_k}). We are only interested in estimating the support; hence δ_1 = s(θ) and δ_2 = θ_s. Then

P²(y; δ′)/P(y; δ) = Π_{i=1}^m (2πσ²)^{−1/2} exp( −( 2(y_i − x′_i)² − (y_i − x_i)² ) / (2σ²) ),    (17)

where x′ = Φθ′. Upon integration we get

∫ P²(y; δ′)/P(y; δ) dy − 1 = e^{‖x − x′‖²/σ²} − 1.    (18)

Using the HCR bound,

tr[Cov(ŝ)] ≥ sup_{δ′} ‖s′ − s‖₂² / ( e^{‖x − x′‖²/σ²} − 1 ).    (19)

If x′ and x live in the same subspace, i.e., s′ = s, the RHS of (19) will be zero. Therefore, in order to find the supremum, we can restrict our attention to all the signals which do not live in the same subspace as x does:

tr[Cov(ŝ)] ≥ sup_{θ′ : s(θ′) ≠ s(θ)} ‖s′ − s‖₂² / ( e^{‖x − x′‖²/σ²} − 1 ).    (20)

For each sequence s′, the numerator of (20) is fixed (it is the ℓ2 distance between the supports and does not depend on the coefficients), while the denominator is minimized by setting x′ = p_{s′}(x). This leads to (16).

In the following, we see how Theorem 3.2 helps us find a lower bound on the number of measurements for reliable ℓ2-norm support recovery.

A. Necessary Conditions

Using the HCR bound, Theorem 3.2 provides a lower bound on the performance of any unbiased estimator for the ℓ2-norm support recovery problem. In words, ℓ2-norm support recovery is unreliable if the RHS of (16) is bounded away from zero, which yields a lower bound on the minimum number of measurements. The following example illustrates how this bound can be used when Gaussian measurement matrices Φ are deployed.

Random Matrices: As an example, we obtain the necessary conditions on the number of measurements required for reliable ℓ2-norm support recovery when each entry Φ_ij is drawn i.i.d. from a Gaussian distribution N(0, 1).

Theorem 3.3: Let the measurement matrix Φ ∈ ℝ^{m×p} be drawn with i.i.d. elements from a Gaussian distribution with zero mean and variance one. Then ℓ2-norm support recovery over the signal class C(θ_min) is unreliable if

m < max{ 2k, σ² log(p − k) / θ_min² }.    (21)

Proof: From Theorem 3.2 we know that for any x′,

tr[Cov(ŝ)] ≥ ‖s − s′‖₂² / ( e^{‖x − x′‖²/σ²} − 1 ).    (22)

ℓ2-norm support recovery is reliable only if (8) holds for any θ ∈ C(θ_min); in particular, for the θ with s(θ) = (1, 2, ..., k) which takes on θ_min as its last non-zero entry, i.e., θ_k = θ_min. Moreover, assume that θ′ is equal to θ on all the positions but the smallest non-zero value. Note that one can make ‖s − s′‖₂² at least (p − k)² by simply choosing s(θ′) = (1, 2, ..., k−1, p), i.e., putting the smallest non-zero entry of θ′ in the last position. Now

x − x′ = Φ(θ − θ′).    (23)

This implies that

‖x − x′‖²/σ² = (2θ_min²/σ²) Z,    (24)

where Z ∼ χ²_m. Note that tr[Cov(ŝ)] is bounded away from zero if ‖s − s′‖₂² / ( e^{‖x−x′‖²/σ²} − 1 ) does not go to zero. This will happen if

‖x − x′‖²/σ² ≲ log( (p − k)² + 1 )    (25)

as p → ∞, where by A ≲ B we mean multiplicatively less than B in asymptote, i.e., there exists a constant δ > 0 such that A ≤ (1 + δ)B. The expression (25) is equivalent to

P( Z ≳ σ² log(p − k) / θ_min² ) → 0    (26)

as p → ∞. It is known that a centralized χ² variate Z with m degrees of freedom satisfies

P( Z − m ≥ 2√(mt) + 2t ) ≤ e^{−t}    (27)

for all t ≥ 0 [10]. Combining (26) and (27) leads to

P( Z ≳ σ² log(p − k)/θ_min² ) ≤ exp( −σ² log(p − k)/(4θ_min²) ),    (28)

provided that

m ≤ σ² log(p − k) / ( (1 + C) θ_min² )    (29)

for some constant C > 0 (note that (27) is only valid for t ≥ 0). Clearly, under the condition (29), the right-hand side of (28) tends to zero as p grows. Table I demonstrates the necessary conditions for different scalings of k and θ_min as a function of p.
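
For toy dimensions, the bound (16) of Theorem 3.2 can be evaluated by brute force: enumerate all N = (p choose k) candidate supports and project x onto each column subspace with least squares. The sketch below (hypothetical helper name, exponential complexity, meant only to make the bound concrete) does exactly that:

    import numpy as np
    from itertools import combinations

    def hcr_support_bound(Phi, theta, support, sigma):
        """Brute-force evaluation of the RHS of (16)."""
        m, p = Phi.shape
        k = len(support)
        x = Phi @ theta
        s = np.sort(np.asarray(support)).astype(float)
        bound = 0.0
        for cand in combinations(range(p), k):
            if set(cand) == set(support):
                continue                                  # the maximum excludes s itself
            A = Phi[:, cand]
            coef, *_ = np.linalg.lstsq(A, x, rcond=None)
            resid = x - A @ coef                          # x - p_{s_i}(x)
            den = np.expm1(resid @ resid / sigma**2)      # e^{...} - 1
            if den <= 0.0:
                continue                                  # subspace contains x; skip it
            num = np.sum((np.asarray(cand, dtype=float) - s) ** 2)  # ||s - s_i||_2^2
            bound = max(bound, num / den)
        return bound

Used together with sample_instance above, this reproduces the qualitative behavior of this section: the lower bound grows as θ_min shrinks relative to σ.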
Up to this point, we have discussed the HCR bound and its application in finding necessary conditions on the number of measurements for reliable ℓ2-norm support recovery. What remains is to find conditions under which the HCR bound is achievable, which consequently provides us with the sufficient number of measurements for reliable ℓ2-norm support recovery.

IV. ACHIEVABILITY OF THE HCR BOUND

We now analyze the performance of the Maximum-Likelihood (ML) estimator for ℓ2-norm support recovery and find conditions under which it becomes unbiased and, in addition, its performance moves towards that of the HCR bound. Provided that any k columns of the measurement matrix Φ are linearly independent, the noiseless measurement vector x = Φθ belongs to one and only one of the N possible subspaces. Since the noise ε ∈ ℝ^m is i.i.d. Gaussian, the ML estimator selects the subspace closest to the observed vector y ∈ ℝ^m. More precisely,

ŝ_ML = argmin_{s′ : |s′| = k} ‖y − p_{s′}(y)‖².    (30)

Now, consider another subspace Φ_{s′} of dimension k, where s′ ≠ s. Clearly, an error happens when ML selects the support s′ in place of the true support s. Let P_{s′} denote the probability that ML selects the subspace s′ instead of s among all the subspaces.

Lemma 4.1: Let y = x + ε, where x = Φθ ∈ Φ_s, ε ∼ N(0, σ²I_m), and let s′ be a support sequence different from s. Then

P_{s′} ≤ P( ‖ε‖ ≥ ‖x − p_{s′}(x)‖/2 ).    (31)

Proof: ML chooses s′ over s if and only if

min_{t ∈ Φ_{s′}} ‖y − t‖ < min_{t ∈ Φ_s} ‖y − t‖.    (32)

Let us assume that ‖ε‖ < ‖x − p_{s′}(x)‖/2. Then, for any t ∈ Φ_{s′},

‖y − t‖ = ‖x − t + ε‖ ≥ ‖x − t‖ − ‖ε‖ (a)> ‖ε‖ = ‖y − x‖ ≥ min_{t ∈ Φ_s} ‖y − t‖,

where in (a) we used the mentioned assumption together with ‖x − t‖ ≥ ‖x − p_{s′}(x)‖. This implies that if ‖ε‖ < ‖x − p_{s′}(x)‖/2, the ML estimator will not choose s′ over s. Since the probability that the estimator picks s′ instead of s (among all the subspaces) is less than the probability that it prefers s′ to s, we get (31).

Lemma 4.2: Let the number of measurements m be even. Then

P_{s′} ≤ e^{−r/2} Σ_{t=0}^{m/2−1} (r/2)^t / t!,    (33)

where r = ‖x − p_{s′}(x)‖² / (4σ²).

Proof: By Lemma 4.1, P_{s′} ≤ 1 − P( ‖ε‖²/σ² < r ). The random variable ‖ε‖²/σ² is distributed according to the chi-square distribution with m degrees of freedom. By using the cdf of the chi-square distribution, we obtain

P_{s′} ≤ 1 − γ(m/2, r/2) / Γ(m/2),    (34)

where Γ(·) is the Gamma function and γ(·, ·) is the lower incomplete Gamma function. It is easy to show that, for an even number m,

γ(m/2, r/2) / Γ(m/2) = e^{−r/2} Σ_{t=m/2}^{∞} (r/2)^t / t!.

Since by the Taylor expansion e^{r/2} = Σ_{t=0}^{∞} (r/2)^t / t!, we obtain

γ(m/2, r/2) / Γ(m/2) = 1 − e^{−r/2} Σ_{t=0}^{m/2−1} (r/2)^t / t!.    (35)

Combining (34) and (35) leads to the lemma.

Lemma 4.3: Let r = αm for some constant α > 1. Then

P_{s′} ≤ ( r / (2α) ) c(α)^{−r},    (36)

in which c(α) = e^{(α−1)/(2α)} / α^{1/(2α)} > 1, and c(α) → √e as α grows.

Proof: Note that for t < r/2, the function f(t) = (r/2)^t / t! is strictly increasing. By observing that m/2 − 1 < r/2 and employing Lemma 4.2, we get

P_{s′} ≤ e^{−r/2} Σ_{t=0}^{m/2−1} (r/2)^t / t! ≤ e^{−r/2} (m/2) (r/2)^{m/2} / (m/2)! (a)< e^{−r/2} (m/2) (r/2)^{m/2} / (m/(2e))^{m/2} = ( r/(2α) ) c(α)^{−r},

where in (a) we used the inequality t! > (t/e)^t. It can be verified that c(α) > 1 for α > 1. Although we do not prove it here, it is not hard to see that the upper bound shows only a linear decay for α → 1.

In the following theorem, we provide an upper bound on the performance of the ML estimator. Based on Lemma 4.1, the probability of error is related to the minimum distance between x and its projections onto the other subspaces. Let

d_min = min_{s′ : s′ ≠ s} ‖x − p_{s′}(x)‖,  β = d_min² / (4σ²m),  and  r_min = βm.

Theorem 4.4: For β > 1, the performance of the ML estimator is upper bounded as

tr[Cov(ŝ_ML)] ≤ (k m p²/2) c(β)^{−r_min}.    (37)

Proof: By Lemma 4.1, we know that if ‖ε‖ < d_min/2, ML makes the correct choice. Therefore, from Lemma 4.3, we obtain

P_err ≤ ( r_min/(2β) ) c(β)^{−r_min}.    (38)

Since ‖s − s_i‖₂² ≤ kp² and P_err = Σ_{i=2}^N P_{s_i}, we obtain

tr[Cov(ŝ_ML)] = Σ_{i=2}^N P_{s_i} ‖s − s_i‖₂² ≤ kp² ( r_min/(2β) ) c(β)^{−r_min} = (k m p²/2) c(β)^{−r_min}.
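
A direct implementation of the ML decoder (30) is feasible for toy problem sizes (a brute-force sketch with (p choose k) least-squares projections; hypothetical helper name, not code from the paper):

    import numpy as np
    from itertools import combinations

    def ml_support(Phi, y, k):
        """Exhaustive ML decoder (30): the k-subset whose column span is closest to y."""
        p = Phi.shape[1]
        best_cand, best_cost = None, np.inf
        for cand in combinations(range(p), k):
            A = Phi[:, cand]
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            cost = np.sum((y - A @ coef) ** 2)   # ||y - p_{s'}(y)||^2
            if cost < best_cost:
                best_cost, best_cand = cost, cand
        return np.array(best_cand)

Applied to an instance from sample_instance above, ml_support(Phi, y, k) returns the true support with high probability once β is bounded away from one, in line with Theorem 4.4.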

In general, the ML estimator can be biased, and its performance cannot be compared with that of unbiased estimators. The following theorem provides us with the condition under which it becomes unbiased.

Theorem 4.5: Under the condition m ≥ (1 + ε) log p / (β log c(β)), for some fixed ε > 0 and β bounded away from 1, the ML estimator is asymptotically unbiased as p → ∞.

Proof: Let ŝ = (n̂_1, n̂_2, ..., n̂_k) be the estimate for the true support set s = (n_1, n_2, ..., n_k). Then

E[ŝ] = Σ_{i=1}^N P(ŝ = s_i) s_i = P(ŝ = s) s + Σ_{i=2}^N P(ŝ = s_i) s_i.

Since Σ_{i=2}^N P(ŝ = s_i) = P_err and 1 ≤ n̂_i ≤ p for 1 ≤ i ≤ k,

Σ_{i=2}^N P(ŝ = s_i) s_i ≤ (p, p, ..., p) P_err.

Since β > 1, from (38) we get

lim_{p→∞} Σ_{i=2}^N P(ŝ = s_i) s_i ≤ lim_{p→∞} (p, p, ..., p) (m/2) c(β)^{−βm} (a)= 0,

where in (a) we used m ≥ (1 + ε) log p / (β log c(β)). Obviously, P(ŝ = s) → 1 as p → ∞. Hence E[ŝ] → s.

As we observe, our results do not depend on any specific measurement matrix. On the one hand, Theorem 4.4 provides us with an upper bound on the error of the ML estimator. On the other hand, since by Theorem 4.5 the ML estimator is unbiased under the mentioned conditions, its estimation error is lower bounded by the HCR bound, which shows a 9 dB gap in the denominator with respect to the upper bound. Therefore, such asymptotic behavior of the ML estimator shows the achievability of the HCR bound, up to this gap, under the mentioned conditions. In the following, we see how Theorem 4.4 leads to the sufficient number of measurements for reliable ℓ2-norm support recovery when the Gaussian measurement ensemble is used.

A. Sufficient Conditions

Theorem 4.4 provides us with an upper bound on the performance of the ML estimator. For reliable ℓ2-norm support recovery, the RHS of (37) should go to zero as p → ∞. To that end, one should make sure, first, that β is bounded away from one, which is a property of the underlying measurement matrix, and second, that the number of measurements m is at least of the order of log p, which assures that the ML estimator is unbiased.

Theorem 4.6: Let the measurement matrix Φ be drawn with i.i.d. elements from a Gaussian distribution N(0, 1). If the minimum value remains constant, meaning θ_min = Θ(1), then m = Θ(k log p) measurements suffice to ensure reliable ℓ2-norm support recovery.

Proof: The proof follows the same lines as [6] to show that both mentioned conditions are simultaneously satisfied. We refer the reader to our technical report [11] for details.

Necessary and sufficient conditions in different regimes are shown in Table I.

TABLE I: NECESSARY AND SUFFICIENT CONDITIONS ON THE NUMBER OF MEASUREMENTS REQUIRED FOR RELIABLE ℓ2-NORM SUPPORT RECOVERY.

Necessary conditions                         | Sufficient conditions
k = Θ(p), θ_min = Θ(1/k): m = Θ(p² log p)    | k = Θ(p), θ_min = Θ(1): m = Θ(p log p)
k = o(p), θ_min = Θ(1/k): m = Θ(k² log p)    | k = o(p), θ_min = Θ(1): m = Θ(k log p)
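
The sufficiency claim can be probed empirically with a Monte Carlo sketch (hypothetical experiment reusing sample_instance and ml_support from the earlier sketches; it estimates E[ρ_2] = E‖ŝ − s‖₂² for the ML decoder over random Gaussian ensembles):

    import numpy as np

    def empirical_rho2(p, k, m, sigma=0.5, trials=100, seed=0):
        """Monte Carlo estimate of E||s_hat - s||_2^2 for the ML decoder."""
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(trials):
            theta, support, Phi, y = sample_instance(p, k, m, sigma, rng=rng)
            s_hat = ml_support(Phi, y, k)
            total += np.sum((np.sort(s_hat) - np.sort(support)) ** 2)
        return total / trials

    # Sweep m around k*log(p); under Theorem 4.6 (theta_min = Θ(1)) the error
    # should collapse once m exceeds a constant multiple of k*log(p), e.g.:
    # for m in (4, 8, 16, 32, 64): print(m, empirical_rho2(p=32, k=2, m=m))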
ACKNOWLEDGMENT

The authors would like to thank Prof. M. J. Wainwright for his help and many useful comments.

REFERENCES

[1] D. L. Donoho, "Compressed sensing," IEEE Trans. Inform. Theory, vol. 52, no. 4, pp. 1289-1306, Apr. 2006.
[2] E. J. Candès, J. Romberg, and T. Tao, "Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information," IEEE Trans. Inform. Theory, vol. 52, no. 2, pp. 489-509, Feb. 2006.
[3] M. Akçakaya and V. Tarokh, "Shannon theoretic limits on noisy compressive sampling," Tech. Rep. arXiv:cs.IT/0711.0366, Harvard Univ., November 2007.
[4] A. K. Fletcher, S. Rangan, and V. K. Goyal, "Necessary and sufficient conditions on sparsity pattern recovery," Tech. Rep. arXiv:cs.IT/0804.1839, UC Berkeley, April 2008.
[5] S. Sarvotham, D. Baron, and R. G. Baraniuk, "Measurements versus bits: Compressed sensing meets information theory," in Proc. Allerton Conference on Control, Communication and Computing, September 2006.
[6] M. J. Wainwright, "Information-theoretic bounds for sparsity recovery: Dense versus sparse measurements," Tech. Rep. 726, Department of Statistics, UC Berkeley, April 2008.
[7] J. M. Hammersley, "On estimating restricted parameters," J. Roy. Stat. Soc. Ser. B, vol. 12, no. 2, pp. 192-240, 1950.
[8] D. G. Chapman and H. Robbins, "Minimum variance estimation without regularity assumptions," Annals Math. Stat., vol. 22, no. 4, pp. 581-586, 1951.
[9] P. Stoica and R. Moses, Introduction to Spectral Analysis, Prentice-Hall, 2000.
[10] B. Laurent and P. Massart, "Adaptive estimation of a quadratic functional by model selection," Annals of Statistics, vol. 28, no. 5, pp. 1302-1338, 2000.
[11] A. Karbasi, A. Hormati, S. Mohajer, R. Urbanke, and M. Vetterli, "Support recovery in compressed sensing: An estimation theory approach," Tech. Rep., EPFL, January 2008. Available at http://infoscience.epfl.ch.