Support recovery in compressed sensing: An estimation theoretic approach
Amin Karbasi, Ali Hormati, Soheil Mohajer, Martin Vetterli
School of Computer and Communication Sciences
École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland.

Abstract—Compressed sensing (CS) deals with the reconstruction of sparse signals from a small number of linear measurements. One of the main challenges in CS is to find the support of a sparse signal from a set of noisy observations. In the CS literature, several information-theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived, where the focus is mainly on random measurement matrices. In this paper, we investigate the support recovery problem from an estimation theory point of view, where no specific assumption is made on the underlying measurement matrix. By using the Hammersley-Chapman-Robbins (HCR) bound, we derive a fundamental lower bound on the performance of any unbiased estimator, which provides necessary conditions for reliable ℓ2-norm support recovery. We then analyze the optimal decoder to provide conditions under which the HCR bound is achievable. This leads to a set of sufficient conditions for reliable ℓ2-norm support recovery.

I. INTRODUCTION

Linear sampling of sparse signals, with a number of samples close to their sparsity level, has recently received great attention under the name of Compressed Sensing or Compressive Sampling (CS) [1], [2]. A k-sparse signal θ ∈ R^p is defined as a signal with k ≪ p nonzero expansion coefficients in some orthonormal basis or frame. The goal of compressed sensing is to find measurement matrices Φ ∈ R^{m×p}, followed by reconstruction algorithms, which allow robust recovery of sparse signals using the least number of measurements and low computational complexity. In practice, however, all the measurements are noisy, and thus the exact recovery of θ is impossible.
Support recovery refers to the problem of correctly estimating the positions of the non-zero entries based on a set of noisy observations. A large body of recent work (e.g., [3], [4], [5], [6]) has established information-theoretic limits for exact support recovery based on the {0, 1}-valued loss function. This work mainly focuses on the standard Gaussian measurement ensemble, where the elements of the measurement matrix are drawn i.i.d. from the Gaussian distribution N(0, 1). In this paper, we look at the support recovery problem from an estimation theory point of view, where the error metric between the true and the estimated support is the ℓ2-norm. The positions of the nonzero entries of θ form a set of integers between 1 and p. Consequently, support recovery in a discrete setup can be regarded as estimating restricted parameters. This leads us to use the Hammersley-Chapman-Robbins (HCR) bound, which provides a lower bound on the variance of any unbiased estimator of a set of restricted parameters [7], [8].

The organization of this paper is as follows. In Section II, we provide a more precise formulation of the problem. We derive the HCR bound for the support recovery problem in Section III, where no assumption is made on the measurement matrix. We then apply the obtained bound to random measurement matrices, in order to determine a lower bound on the number of measurements for reliable ℓ2-norm support recovery. Of equal interest are the conditions under which the derived HCR bound is achievable. To this end, in Section IV, we study the performance of the Maximum-Likelihood (ML) decoder and derive conditions under which it becomes unbiased and achieves the HCR bound. Again, no assumption is made on the measurement matrix. Using the Gaussian measurement ensemble as an example, we can then identify the sufficient number of measurements for reliable ℓ2-norm support recovery.

II. PROBLEM STATEMENT

In this paper, we consider a deterministic signal model in which θ ∈ R^p is a fixed but unknown vector with exactly k non-zero entries.
We refer to k as the signal sparsity, p as the signal dimension, and define the support vector as the positions of the non-zero elements of θ. More precisely,

s(θ) = (n_1, n_2, ..., n_k),  (1)

where the corresponding non-zero entries of θ are

θ_s = (θ_{n_1}, θ_{n_2}, ..., θ_{n_k}).  (2)

We assume that n_1 < n_2 < ··· < n_k. Suppose we are given a vector of noisy observations y ∈ R^m of the form

y = Φθ + ε,  (3)

where Φ ∈ R^{m×p} is the measurement matrix, and ε ~ N(0, σ²I_m) is additive Gaussian noise. Throughout this paper, we assume w.l.o.g. that σ is fixed, since any scaling of σ can be accounted for in the scaling of θ. Let x = Φθ, and let Φ_s denote the subspace spanned by the columns of Φ at positions indexed by s(θ). Since there are N = (p choose k) subspaces of dimension k, a number from 1 to N can be assigned to them and, w.l.o.g., we assume that x belongs to the first subspace, i.e., s_1 = s. Due to the presence of noise, θ cannot be recovered exactly. However, a sparse-recovery algorithm outputs an estimate θ̂. In the support recovery problem, we are only interested in
estimating the support. To that end, we can consider different performance metrics for the estimate. In [6], the measure of error between the estimate and the true signal is a {0, 1}-valued loss function:

ρ_1(θ̂, θ) = I[s(θ̂) ≠ s(θ)],  (4)

where I[·] is the indicator function. This metric is appropriate for exact support recovery. In this work, we are interested in approximate support recovery. For this purpose, we consider the following ℓ2-norm error metric:

ρ_2(θ̂, θ) = ||s(θ̂) − s(θ)||².  (5)

Note that ρ_2(θ̂, θ) = 0 implies ρ_1(θ̂, θ) = 0 and vice-versa. As was mentioned in [6], the SNR is not suitable for the support recovery problem. It is possible to generate problem instances for which support recovery is arbitrarily difficult, in particular by sending the smallest coefficient to zero (assuming that k > 1) at an arbitrarily rapid rate, even as the SNR becomes arbitrarily large by increasing the rest. Hence, we also define

θ_min = min_{i ∈ s} |θ_i|.  (6)

In particular, our results apply to any unbiased decoder that operates over the signal class

C(θ_min) = {θ ∈ R^p : |θ_i| ≥ θ_min ∀ i ∈ s}.  (7)

With this setup, our goal is to find conditions for any unbiased estimator, based on the parameters p, k, m, and θ_min, under which the variance of the error for any signal picked from the signal class C(θ_min) goes to zero as the signal dimension increases. Our analysis is high-dimensional in nature, in the sense that the signal dimension p goes to infinity. More precisely, we say that the ℓ2-norm support recovery is reliable if

lim_{p→∞} E[ρ_2(θ̂, θ)] = 0,  (8)

for any θ ∈ C(θ_min), under some scaling of θ_min, m, and k as a function of p. For unbiased estimators, (8) is equivalent to

lim_{p→∞} tr[cov(ŝ(θ))] = 0,  (9)

where ŝ(θ) is the estimated support of θ. Since the support estimation is based on y, with abuse of notation, we also denote it by ŝ(y). Throughout this paper, we only consider unbiased estimators.

III.
HAMMERSLEY-CHAPMAN-ROBBINS BOUND

The Cramér-Rao (CR) bound is a well-known tool in statistics which provides a lower bound on the variance of the error of any unbiased estimator of an unknown deterministic parameter δ from a set of measurements y [9]. More specifically, in a single-parameter scenario, the estimated value δ̂ satisfies

var(δ̂) ≥ [ ∫ (∂ ln P(y; δ)/∂δ)² P(y; δ) dy ]^{−1},  (10)

where P(y; δ) is the pdf of the measurements, which depends on the parameter δ. As (10) suggests, the CR bound is typically derived for estimating a continuous parameter. In many cases, there is a priori information on the estimated parameter which restricts it to take values from a predetermined set. An example is the estimation of the mean of a normal distribution when one knows that the true mean is an integer. In such scenarios, the Hammersley-Chapman-Robbins (HCR) bound provides a stronger lower bound on the variance of any unbiased estimator [7], [8]. More specifically, let us assume that the set of independent observations y = (y_1, y_2, ..., y_m) is drawn according to a probability distribution with density function P(y; δ), where δ is a parameter belonging to some parameter set (e.g., the set of integer numbers) which completely characterizes the pdf. In addition, the sequence δ is partitioned into two subsequences δ = (δ_1, δ_2), where we are only interested in estimating the parameters included in subsequence δ_1. Let δ̂_1(y) denote an unbiased estimator of δ_1. The HCR bound on the trace of the covariance matrix of any unbiased estimator of δ_1 is given by

tr[cov(δ̂_1)] ≥ sup_{δ′} ||δ′_1 − δ_1||² / ( ∫ P²(y; δ′)/P(y; δ) dy − 1 ),  (11)

in which δ′ = (δ′_1, δ_2). The parameter set is chosen so that δ′ takes values according to the a priori information.

Example 3.1: For clarity, let us consider the performance of an unbiased estimator of the mean of a normal distribution based on independent samples of size m, i.e., y = (y_1, y_2, ..., y_m). In this case, δ = (μ, σ), δ_1 = μ, δ_2 = σ, and

P(y; δ) = (2π)^{−m/2} σ^{−m} e^{−(1/2σ²) Σ_{i=1}^m (y_i − μ)²}.  (12)

Let μ̂(y) denote an unbiased estimator of μ, the parameter we want to estimate.
When there is no prior information on μ, it follows from the CR bound that

var(μ̂) ≥ σ²/m.  (13)

Once the mean is restricted to be an integer, we may write δ_1 = μ and δ′_1 = μ + α, where α is a non-zero integer. Then, upon integration, we get

var(μ̂) ≥ max_{α ≠ 0} α² / (e^{mα²/σ²} − 1)  (14)
        = 1 / (e^{m/σ²} − 1),  (15)

where the maximum is attained for α = ±1. A point worth mentioning is the role of prior information: while (13) drops linearly, (15) decreases exponentially with respect to the number of observations. It is also interesting to note that (14) applies as well to the case in which the parameter is not restricted. We then have to deal with the maximization in (14) over variations in α, where α may take any value (not necessarily integral) except α = 0. Since the RHS of (14) is a decreasing function of |α|, we let α → 0 and we deduce (13).

In the support recovery problem, we know a priori that each entry of the support vector takes values from the restricted set {1, 2, ..., p}. Hence the HCR bound can provide us with a lower bound on the performance of any unbiased estimator.
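Example 3.1 can be checked numerically. The sketch below (the values of m, σ, the true mean, and the seed are illustrative choices, not from the paper) compares the integer-restricted bound (15) with the empirical MSE of an unbiased estimator that exploits the prior, namely rounding the sample mean to the nearest integer:

```python
import math
import random

def hcr_bound(m, sigma):
    # HCR lower bound (15) for estimating an integer-valued mean
    # from m Gaussian samples: 1 / (exp(m / sigma^2) - 1).
    return 1.0 / (math.exp(m / sigma ** 2) - 1.0)

def empirical_mse(mu, m, sigma, trials, rng):
    # Rounding the sample mean is unbiased when the true mean mu is an
    # integer, since the estimation error is symmetric around zero.
    se = 0.0
    for _ in range(trials):
        mean = sum(rng.gauss(mu, sigma) for _ in range(m)) / m
        se += (round(mean) - mu) ** 2
    return se / trials

rng = random.Random(0)
m, sigma = 5, 1.0
print(hcr_bound(m, sigma))                     # about 6.8e-3
print(empirical_mse(3, m, sigma, 20000, rng))  # stays above the bound
```

As expected, the simple rounding estimator does not attain the bound here, but the gap to the CR bound σ²/m = 0.2 illustrates how much the integer prior tightens the lower bound.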
Theorem 3.2: Assume ŝ(y) to be an unbiased estimator of the support s. The HCR lower bound on the variance of ŝ(y) is given by

tr[cov(ŝ)] ≥ max_{i ∈ {2, ..., N}} ||s − s_i||² / (e^{||x − p_{s_i}(x)||²/σ²} − 1),  (16)

in which p_{s_i}(x) denotes the projection of x onto the subspace spanned by Φ_{s_i}.

Proof: Since our observations are of the form y = Φθ + ε, the set of unknown parameters δ consists of the support vector s(θ) = (n_1, n_2, ..., n_k) and the corresponding coefficients θ_s = (θ_{n_1}, θ_{n_2}, ..., θ_{n_k}). We are only interested in estimating the support; hence, δ_1 = s(θ) and δ_2 = θ_s. Then

P²(y; δ′)/P(y; δ) = Π_{i=1}^m (2πσ²)^{−1/2} e^{−[(y_i − x′_i)² + (x′_i − x_i)(x′_i + x_i − 2y_i)]/2σ²},  (17)

where x′ = Φθ′. Upon integration we get

∫ P²(y; δ′)/P(y; δ) dy − 1 = e^{||x′ − x||²/σ²} − 1.  (18)

Using the HCR bound,

tr[cov(ŝ)] ≥ sup_{δ′} ||s′ − s||² / (e^{||x′ − x||²/σ²} − 1).  (19)

If x′ and x live in the same subspace, i.e., s′ = s, the RHS of (19) will be zero. Therefore, in order to find the supremum, we can restrict our attention to all the signals which do not live in the same subspace as x does:

sup_{θ′: s(θ′) ≠ s(θ)} ||s′ − s||² / (e^{||x′ − x||²/σ²} − 1).  (20)

For each sequence s′, the numerator of (20) is fixed (it is the squared ℓ2 distance between the supports) and does not depend on the coefficients, while the denominator is minimized by setting x′ = p_{s′}(x). This leads to (16).

In the following, we see how Theorem 3.2 helps us find a lower bound on the number of measurements for reliable ℓ2-norm support recovery.

A. Necessary Conditions

Using the HCR bound, Theorem 3.2 provides a lower bound on the performance of any unbiased estimator for the ℓ2-norm support recovery problem. In words, ℓ2-norm support recovery is unreliable if the RHS of (16) remains bounded away from zero, which yields a lower bound on the minimum number of measurements. The following example illustrates how this bound can be used when Gaussian measurement matrices Φ are deployed.

Random Matrices: As an example, we obtain the necessary conditions on the number of measurements required for reliable ℓ2-norm support recovery when each entry Φ_ij is drawn i.i.d. from a Gaussian distribution N(0, 1).

Theorem 3.3: Let the measurement matrix Φ ∈ R^{m×p} be drawn with i.i.d.
elements from a Gaussian distribution with zero mean and variance one. Then the ℓ2-norm support recovery over the signal class C(θ_min) is unreliable if

m < max{ k, (σ²/θ²_min) log(p − k) }.  (21)

Proof: From Theorem 3.2 we know that for any x′,

tr[cov(ŝ)] ≥ ||s′ − s||² / (e^{||x′ − x||²/σ²} − 1).  (22)

The ℓ2-norm support recovery is reliable if (8) holds for any θ ∈ C(θ_min); in particular, when s(θ) = (1, 2, ..., k) and θ takes on θ_min as its last non-zero entry, i.e., θ_k = θ_min. Moreover, assume that θ′ is equal to θ on all the positions but the smallest non-zero value. Note that one can find a θ′ such that ||s′ − s|| is at least p − k, by simply choosing s(θ′) = (1, 2, ..., k − 1, p), i.e., putting the smallest non-zero entry of θ′ in the last position. Now

x − x′ = Φ(θ − θ′).  (23)

This implies that

||x − x′||²/σ² = (2θ²_min/σ²) Z,  (24)

where Z ~ χ²_m. Note that tr[cov(ŝ)] is bounded away from zero if ||s′ − s||²/(e^{||x − x′||²/σ²} − 1) does not go to zero. This will happen if

||x − x′||²/σ² <. 2 log(p − k),  (25)

as p → ∞, where by A <. B we mean A is multiplicatively less than B in asymptote, i.e., there exists a constant δ > 0 such that A ≤ (1 + δ)B. The expression (25) is equivalent to

P[ Z >. (σ²/θ²_min) log(p − k) ] → 0,  (26)

as p → ∞. It is known that a centralized χ² variate Z with m degrees of freedom satisfies

P[ Z − m ≥ 2√(mt) + 2t ] ≤ e^{−t},  (27)

for all t ≥ 0 [10]. Combining (26) and (27) leads to

P[ Z >. (σ²/θ²_min) log(p − k) ] ≤ e^{−(σ²/θ²_min) log(p − k)/4},  (28)

provided that

m ≤ (σ²/θ²_min) log(p − k) / (1 + C),  (29)

for some constant C > 0 (note that (27) is only valid for t ≥ 0). Clearly, under the condition (29), the right-hand side of (28) tends to zero as p grows. Table I demonstrates the necessary conditions for different scalings of k and θ_min as a function of p.

Up to this point, we have discussed the HCR bound and its application in finding necessary conditions on the number of measurements for reliable ℓ2-norm support recovery. What remains is to find conditions under which the HCR bound is achievable, which consequently provides us with the sufficient number of measurements for reliable ℓ2-norm support recovery.
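The chi-square tail bound of Laurent and Massart [10] used in the proof above can be verified by simulation. The sketch below (the values of m and t, and the seed, are illustrative) estimates the tail probability by Monte Carlo and compares it with e^{−t}:

```python
import math
import random

def chi2_upper_tail_mc(m, t, trials, rng):
    # Monte Carlo estimate of P[Z - m >= 2*sqrt(m*t) + 2*t] for Z ~ chi^2_m,
    # drawing Z as a sum of m squared standard normal variables.
    thresh = m + 2.0 * math.sqrt(m * t) + 2.0 * t
    hits = sum(
        1
        for _ in range(trials)
        if sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(m)) >= thresh
    )
    return hits / trials

rng = random.Random(1)
m, t = 50, 2.0
print(chi2_upper_tail_mc(m, t, 20000, rng), "<=", math.exp(-t))
```

The empirical tail sits well below e^{−t}, which is consistent with the bound being loose but sufficient for the scaling argument in the proof.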
IV. ACHIEVABILITY OF THE HCR BOUND

We now analyze the performance of the Maximum-Likelihood (ML) estimator for the ℓ2-norm support recovery and find conditions under which it becomes unbiased and, in addition, its performance moves towards that of the HCR bound. Provided that any k columns of the measurement matrix Φ are linearly independent, the noiseless measurement vector x = Φθ belongs to one and only one of the N possible subspaces. Since the noise ε ∈ R^m is i.i.d. Gaussian, the ML estimator selects the subspace closest to the observed vector y ∈ R^m. More precisely,

ŝ = argmin_{s′: |s′| = k} ||y − p_{s′}(y)||.  (30)

Now, consider another subspace Φ_{s′} of dimension k, where s′ ≠ s. Clearly, an error happens when ML selects the support s′ in place of the true support s. Let P(s → s′) denote the probability that ML selects the subspace s′ instead of s among all the subspaces.

Lemma 4.1: Let y = x + ε, where x = Φθ ∈ Φ_s, ε ~ N(0, σ²I_m), and let s′ be a support sequence different from s. Then

P(s → s′) < P( ||ε|| ≥ ||x − p_{s′}(x)||/2 ).  (31)

Proof: ML chooses s′ over s if and only if

min_{t ∈ Φ_{s′}} ||y − t|| < min_{t ∈ Φ_s} ||y − t||.  (32)

Let us assume that ||ε|| < ||x − p_{s′}(x)||/2. Then, for any t ∈ Φ_{s′},

||y − t|| = ||x − t + ε|| ≥ ||x − t|| − ||ε|| (a)> ||ε|| = ||y − x|| ≥ min_{t′ ∈ Φ_s} ||y − t′||,

where in (a) we used the mentioned assumption. This implies that if ||ε|| < ||x − p_{s′}(x)||/2, the ML estimator will not choose s′ over s. Since the probability that the ML estimator picks s′ instead of s is less than the probability that it prefers s′ to s, we get (31).

Lemma 4.2: Let the number of measurements m be even. Then

P(s → s′) < e^{−r/2} Σ_{t=0}^{m/2−1} (r/2)^t / t!,  (33)

where r = ||x − p_{s′}(x)||² / (4σ²).

Proof: By Lemma 4.1, P(s → s′) < 1 − P( ||ε||²/σ² < r ). The random variable ||ε||²/σ² is distributed according to the chi-square distribution with m degrees of freedom. By using the cdf of the chi-square distribution, we obtain

P(s → s′) < 1 − γ(m/2, r/2)/Γ(m/2),  (34)

where Γ(·) is the Gamma function and γ(·, ·) is the lower incomplete Gamma function. It is easy to show that for an even number m,

γ(m/2, r/2)/Γ(m/2) = e^{−r/2} Σ_{t=m/2}^{∞} (r/2)^t / t!.

Since by the Taylor expansion e^{r/2} = Σ_{t=0}^{∞} (r/2)^t / t!, we obtain

γ(m/2, r/2)/Γ(m/2) = 1 − e^{−r/2} Σ_{t=0}^{m/2−1} (r/2)^t / t!.  (35)

Combining (34) and (35) will lead to the lemma.

Lemma 4.3: Let r = αm for some constant α > 1.
Then

P(s → s′) < (r/2α) c(α)^{−r},  (36)

in which c(α) = e^{(α−1)/2α} / α^{1/2α} > 1, and c(α) → √e as α grows.

Proof: Note that for t < r/2, the function f(t) = (r/2)^t/t! is strictly increasing. By observing that m/2 − 1 < r/2, and employing Lemma 4.2, we get:

P(s → s′) < e^{−r/2} Σ_{t=0}^{m/2−1} (r/2)^t/t! < e^{−r/2} (m/2) (r/2)^{m/2} / (m/2)! (a)< e^{−r/2} (m/2) (r/2)^{m/2} / (m/2e)^{m/2} = (r/2α) e^{−(α−1)r/2α} α^{r/2α} = (r/2α) c(α)^{−r},

where in (a) we used the inequality (m/2)! > (m/2e)^{m/2}. It can be verified that c(α) > 1 for α > 1. Although we do not prove it here, it is not hard to see that the upper bound shows only a linear decay as α → 1.

In the following theorem, we provide an upper bound on the performance of the ML estimator. Based on Lemma 4.1, the probability of error is related to the minimum distance between x and its projections onto the other subspaces. Let

d_min = min_{s′: s′ ≠ s} ||x − p_{s′}(x)||,  β = d²_min/(4σ²m),  and  r_min = βm.

Theorem 4.4: For β > 1, the performance of the ML estimator is upper bounded as

tr[cov(ŝ)] < (kmp²/2) c(β)^{−r_min}.  (37)

Proof: By Lemma 4.1, we know that if ||ε|| < d_min/2, ML makes the correct choice. Therefore, from Lemma 4.3, we obtain

P_err < (r_min/2β) c(β)^{−r_min}.  (38)
Since ||s − s_i||² < kp² and P_err = Σ_{i=2}^N P(s → s_i), we obtain

tr[cov(ŝ)] = Σ_{i=2}^N P(s → s_i) ||s − s_i||² < kp² (r_min/2β) c(β)^{−r_min} = (kmp²/2) c(β)^{−r_min}.

In general, the ML estimator can be biased, and its performance cannot be compared with unbiased estimators. The following theorem provides us with the condition under which it becomes unbiased.

Theorem 4.5: Under the conditions m ≥ (1 + ε) log p / (β log c(β)), for some fixed ε > 0, and β bounded away from 1, the ML estimator is asymptotically unbiased as p → ∞.

Proof: Let ŝ = (n̂_1, n̂_2, ..., n̂_k) be the ML estimate for the true support set s = (n_1, n_2, ..., n_k). Then

E[ŝ] = Σ_{i=1}^N s_i P(ŝ = s_i) = s P(ŝ = s) + Σ_{i=2}^N s_i P(ŝ = s_i).

Since Σ_{i=2}^N P(ŝ = s_i) = P_err and 1 ≤ n̂_i ≤ p for 1 ≤ i ≤ k,

Σ_{i=2}^N s_i P(ŝ = s_i) ≤ (p, p, ..., p) P_err

component-wise. Since β > 1, from (38) we get

lim_{p→∞} Σ_{i=2}^N s_i P(ŝ = s_i) ≤ lim_{p→∞} (p, p, ..., p) (m/2) c(β)^{−βm} (a)= 0,

where in (a) we used m ≥ (1 + ε) log p / (β log c(β)). Obviously, P(ŝ = s) → 1 as p → ∞. Hence lim_{p→∞} E[ŝ] = s.

As we observe, our results do not depend on any specific measurement matrix. On the one hand, Theorem 4.4 provides us with an upper bound on the error of the ML estimator. On the other hand, since by Theorem 4.5 the ML estimator is unbiased under the mentioned conditions, its estimation error is lower bounded by the HCR bound, which shows a 9 dB gap in the denominator with the upper bound. Therefore, such asymptotic behavior of the ML estimator shows the achievability of the HCR bound, under the mentioned conditions. In the following, we see how Theorem 4.4 leads to the sufficient number of measurements for reliable ℓ2-norm support recovery when the Gaussian measurement ensemble is used.

A. Sufficient Conditions

Theorem 4.4 provides us with an upper bound on the performance of the ML estimator. For reliable ℓ2-norm support recovery, the RHS of (37) should go to zero as p → ∞.
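For small p and k, the ML decoder (30) can be implemented by brute force, projecting y onto each of the (p choose k) column subspaces via least squares. The sketch below is only illustrative (the dimensions, noise level, seed, and true support are arbitrary choices, not from the paper):

```python
import itertools
import numpy as np

def ml_support(y, Phi, k):
    # Exhaustive ML decoder (30): among all k-subsets of columns, pick the
    # subspace whose projection leaves the smallest residual ||y - p_s(y)||.
    p = Phi.shape[1]
    best, best_res = None, np.inf
    for s in itertools.combinations(range(p), k):
        A = Phi[:, list(s)]
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        res = np.linalg.norm(y - A @ coef)
        if res < best_res:
            best, best_res = s, res
    return best

rng = np.random.default_rng(0)
m, p, k, sigma = 20, 8, 2, 0.1          # illustrative dimensions
Phi = rng.standard_normal((m, p))
theta = np.zeros(p)
support = (1, 5)                         # illustrative true support
theta[list(support)] = [1.0, -1.5]
y = Phi @ theta + sigma * rng.standard_normal(m)
print(ml_support(y, Phi, k))
```

The loop visits all (p choose k) supports, so this is only practical for toy dimensions; the theorems above concern the number of measurements this decoder needs, not its computational cost.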
To that end, one should make sure, first, that β is bounded away from one, which is a property of the underlying measurement matrix, and second, that the number of measurements m is at least of the order of log p, which assures that the ML estimator is unbiased.

TABLE I: NECESSARY AND SUFFICIENT CONDITIONS ON THE NUMBER OF MEASUREMENTS REQUIRED FOR RELIABLE ℓ2-NORM SUPPORT RECOVERY, FOR DIFFERENT SCALINGS OF k AND θ_min AS FUNCTIONS OF p.

Theorem 4.6: Let the measurement matrix Φ be drawn with i.i.d. elements from a Gaussian distribution N(0, 1). If the minimum value remains constant, meaning θ_min = Θ(1), then m = Θ(k log p) measurements suffice to ensure reliable ℓ2-norm support recovery.

Proof: The proof follows the same lines as [6] to show that both mentioned conditions are simultaneously satisfied. We refer the reader to our technical report [11] for details.

Sufficient conditions in different regimes are shown in Table I.

ACKNOWLEDGMENT

The authors would like to thank Prof. M. J. Wainwright for his help and many useful comments.

REFERENCES

[1] D. L. Donoho, "Compressed sensing," IEEE Trans. Inform. Theory, vol. 52, no. 4, pp. 1289–1306, Apr. 2006.
[2] E. J. Candès, J. Romberg, and T. Tao, "Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information," IEEE Trans. Inform. Theory, vol. 52, no. 2, pp. 489–509, Feb. 2006.
[3] M. Akçakaya and V. Tarokh, "Shannon theoretic limits on noisy compressive sampling," Tech. Rep. arXiv:cs.IT, Harvard Univ., November 2007.
[4] A. K. Fletcher, S. Rangan, and V. K. Goyal, "Necessary and sufficient conditions on sparsity pattern recovery," Tech. Rep. arXiv:cs.IT, UC Berkeley, April 2008.
[5] S. Sarvotham, D. Baron, and R. G. Baraniuk, "Measurements versus bits: Compressed sensing meets information theory," in Proc. Allerton Conference on Control, Communication and Computing, September 2006.
[6] M. J. Wainwright, "Information-theoretic bounds for sparsity recovery: Dense versus sparse measurements," Tech. Rep.,
Department of Statistics, UC Berkeley, April 2008.
[7] J. M. Hammersley, "On estimating restricted parameters," J. Roy. Stat. Soc. Ser. B, vol. 12, no. 2, pp. 192–240, 1950.
[8] D. G. Chapman and H. Robbins, "Minimum variance estimation without regularity assumptions," Annals Math. Stat., vol. 22, no. 4, pp. 581–586, 1951.
[9] P. Stoica and R. Moses, Introduction to Spectral Analysis, Prentice-Hall, 2000.
[10] B. Laurent and P. Massart, "Adaptive estimation of a quadratic functional by model selection," Annals of Statistics, vol. 28, no. 5, pp. 1302–1338, 2000.
[11] A. Karbasi, A. Hormati, S. Mohajer, R. Urbanke, and M. Vetterli, "Support recovery in compressed sensing: An estimation theory approach," Tech. Rep., EPFL, January 2008.
More informationCombining Classifiers
Cobining Classifiers Generic ethods of generating and cobining ultiple classifiers Bagging Boosting References: Duda, Hart & Stork, pg 475-480. Hastie, Tibsharini, Friedan, pg 246-256 and Chapter 10. http://www.boosting.org/
More informationON THE TWO-LEVEL PRECONDITIONING IN LEAST SQUARES METHOD
PROCEEDINGS OF THE YEREVAN STATE UNIVERSITY Physical and Matheatical Sciences 04,, p. 7 5 ON THE TWO-LEVEL PRECONDITIONING IN LEAST SQUARES METHOD M a t h e a t i c s Yu. A. HAKOPIAN, R. Z. HOVHANNISYAN
More informationAn RIP-based approach to Σ quantization for compressed sensing
An RIP-based approach to Σ quantization for copressed sensing Joe-Mei Feng and Felix Kraher October, 203 Abstract In this paper, we provide new approach to estiating the error of reconstruction fro Σ quantized
More informationA Simple Homotopy Algorithm for Compressive Sensing
A Siple Hootopy Algorith for Copressive Sensing Lijun Zhang Tianbao Yang Rong Jin Zhi-Hua Zhou National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China Departent of Coputer
More informationKeywords: Estimator, Bias, Mean-squared error, normality, generalized Pareto distribution
Testing approxiate norality of an estiator using the estiated MSE and bias with an application to the shape paraeter of the generalized Pareto distribution J. Martin van Zyl Abstract In this work the norality
More information3.8 Three Types of Convergence
3.8 Three Types of Convergence 3.8 Three Types of Convergence 93 Suppose that we are given a sequence functions {f k } k N on a set X and another function f on X. What does it ean for f k to converge to
More informationOn the Use of A Priori Information for Sparse Signal Approximations
ITS TECHNICAL REPORT NO. 3/4 On the Use of A Priori Inforation for Sparse Signal Approxiations Oscar Divorra Escoda, Lorenzo Granai and Pierre Vandergheynst Signal Processing Institute ITS) Ecole Polytechnique
More informationThis article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal non-commercial research and
This article appeared in a ournal published by Elsevier. The attached copy is furnished to the author for internal non-coercial research and education use, including for instruction at the authors institution
More informationAn Adaptive UKF Algorithm for the State and Parameter Estimations of a Mobile Robot
Vol. 34, No. 1 ACTA AUTOMATICA SINICA January, 2008 An Adaptive UKF Algorith for the State and Paraeter Estiations of a Mobile Robot SONG Qi 1, 2 HAN Jian-Da 1 Abstract For iproving the estiation accuracy
More informationTesting equality of variances for multiple univariate normal populations
University of Wollongong Research Online Centre for Statistical & Survey Methodology Working Paper Series Faculty of Engineering and Inforation Sciences 0 esting equality of variances for ultiple univariate
More informationA PROBABILISTIC AND RIPLESS THEORY OF COMPRESSED SENSING. Emmanuel J. Candès Yaniv Plan. Technical Report No November 2010
A PROBABILISTIC AND RIPLESS THEORY OF COMPRESSED SENSING By Eanuel J Candès Yaniv Plan Technical Report No 200-0 Noveber 200 Departent of Statistics STANFORD UNIVERSITY Stanford, California 94305-4065
More informationIn this chapter, we consider several graph-theoretic and probabilistic models
THREE ONE GRAPH-THEORETIC AND STATISTICAL MODELS 3.1 INTRODUCTION In this chapter, we consider several graph-theoretic and probabilistic odels for a social network, which we do under different assuptions
More informationLecture October 23. Scribes: Ruixin Qiang and Alana Shine
CSCI699: Topics in Learning and Gae Theory Lecture October 23 Lecturer: Ilias Scribes: Ruixin Qiang and Alana Shine Today s topic is auction with saples. 1 Introduction to auctions Definition 1. In a single
More informationProc. of the IEEE/OES Seventh Working Conference on Current Measurement Technology UNCERTAINTIES IN SEASONDE CURRENT VELOCITIES
Proc. of the IEEE/OES Seventh Working Conference on Current Measureent Technology UNCERTAINTIES IN SEASONDE CURRENT VELOCITIES Belinda Lipa Codar Ocean Sensors 15 La Sandra Way, Portola Valley, CA 98 blipa@pogo.co
More informationTopic 5a Introduction to Curve Fitting & Linear Regression
/7/08 Course Instructor Dr. Rayond C. Rup Oice: A 337 Phone: (95) 747 6958 E ail: rcrup@utep.edu opic 5a Introduction to Curve Fitting & Linear Regression EE 4386/530 Coputational ethods in EE Outline
More informationA general forulation of the cross-nested logit odel Michel Bierlaire, Dpt of Matheatics, EPFL, Lausanne Phone: Fax:
A general forulation of the cross-nested logit odel Michel Bierlaire, EPFL Conference paper STRC 2001 Session: Choices A general forulation of the cross-nested logit odel Michel Bierlaire, Dpt of Matheatics,
More information1 Proof of learning bounds
COS 511: Theoretical Machine Learning Lecturer: Rob Schapire Lecture #4 Scribe: Akshay Mittal February 13, 2013 1 Proof of learning bounds For intuition of the following theore, suppose there exists a
More informationCS Lecture 13. More Maximum Likelihood
CS 6347 Lecture 13 More Maxiu Likelihood Recap Last tie: Introduction to axiu likelihood estiation MLE for Bayesian networks Optial CPTs correspond to epirical counts Today: MLE for CRFs 2 Maxiu Likelihood
More informationA Probabilistic and RIPless Theory of Compressed Sensing
A Probabilistic and RIPless Theory of Copressed Sensing Eanuel J Candès and Yaniv Plan 2 Departents of Matheatics and of Statistics, Stanford University, Stanford, CA 94305 2 Applied and Coputational Matheatics,
More informationPattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lesson 1 4 October 2017 Outline Learning and Evaluation for Pattern Recognition Notation...2 1. The Pattern Recognition
More informationBipartite subgraphs and the smallest eigenvalue
Bipartite subgraphs and the sallest eigenvalue Noga Alon Benny Sudaov Abstract Two results dealing with the relation between the sallest eigenvalue of a graph and its bipartite subgraphs are obtained.
More informationASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical
IEEE TRANSACTIONS ON INFORMATION THEORY Large Alphabet Source Coding using Independent Coponent Analysis Aichai Painsky, Meber, IEEE, Saharon Rosset and Meir Feder, Fellow, IEEE arxiv:67.7v [cs.it] Jul
More informationBoosting with log-loss
Boosting with log-loss Marco Cusuano-Towner Septeber 2, 202 The proble Suppose we have data exaples {x i, y i ) i =... } for a two-class proble with y i {, }. Let F x) be the predictor function with the
More informationMultivariate Methods. Matlab Example. Principal Components Analysis -- PCA
Multivariate Methos Xiaoun Qi Principal Coponents Analysis -- PCA he PCA etho generates a new set of variables, calle principal coponents Each principal coponent is a linear cobination of the original
More informationA NEW ROBUST AND EFFICIENT ESTIMATOR FOR ILL-CONDITIONED LINEAR INVERSE PROBLEMS WITH OUTLIERS
A NEW ROBUST AND EFFICIENT ESTIMATOR FOR ILL-CONDITIONED LINEAR INVERSE PROBLEMS WITH OUTLIERS Marta Martinez-Caara 1, Michael Mua 2, Abdelhak M. Zoubir 2, Martin Vetterli 1 1 School of Coputer and Counication
More informationDesign of Spatially Coupled LDPC Codes over GF(q) for Windowed Decoding
IEEE TRANSACTIONS ON INFORMATION THEORY (SUBMITTED PAPER) 1 Design of Spatially Coupled LDPC Codes over GF(q) for Windowed Decoding Lai Wei, Student Meber, IEEE, David G. M. Mitchell, Meber, IEEE, Thoas
More informationAsynchronous Gossip Algorithms for Stochastic Optimization
Asynchronous Gossip Algoriths for Stochastic Optiization S. Sundhar Ra ECE Dept. University of Illinois Urbana, IL 680 ssrini@illinois.edu A. Nedić IESE Dept. University of Illinois Urbana, IL 680 angelia@illinois.edu
More informationIntelligent Systems: Reasoning and Recognition. Perceptrons and Support Vector Machines
Intelligent Systes: Reasoning and Recognition Jaes L. Crowley osig 1 Winter Seester 2018 Lesson 6 27 February 2018 Outline Perceptrons and Support Vector achines Notation...2 Linear odels...3 Lines, Planes
More informationINNER CONSTRAINTS FOR A 3-D SURVEY NETWORK
eospatial Science INNER CONSRAINS FOR A 3-D SURVEY NEWORK hese notes follow closely the developent of inner constraint equations by Dr Willie an, Departent of Building, School of Design and Environent,
More informationCOS 424: Interacting with Data. Written Exercises
COS 424: Interacting with Data Hoework #4 Spring 2007 Regression Due: Wednesday, April 18 Written Exercises See the course website for iportant inforation about collaboration and late policies, as well
More informationarxiv: v1 [cs.ds] 17 Mar 2016
Tight Bounds for Single-Pass Streaing Coplexity of the Set Cover Proble Sepehr Assadi Sanjeev Khanna Yang Li Abstract arxiv:1603.05715v1 [cs.ds] 17 Mar 2016 We resolve the space coplexity of single-pass
More informationSTOPPING SIMULATED PATHS EARLY
Proceedings of the 2 Winter Siulation Conference B.A.Peters,J.S.Sith,D.J.Medeiros,andM.W.Rohrer,eds. STOPPING SIMULATED PATHS EARLY Paul Glasseran Graduate School of Business Colubia University New Yor,
More informationLecture 20 November 7, 2013
CS 229r: Algoriths for Big Data Fall 2013 Prof. Jelani Nelson Lecture 20 Noveber 7, 2013 Scribe: Yun Willia Yu 1 Introduction Today we re going to go through the analysis of atrix copletion. First though,
More information3.3 Variational Characterization of Singular Values
3.3. Variational Characterization of Singular Values 61 3.3 Variational Characterization of Singular Values Since the singular values are square roots of the eigenvalues of the Heritian atrices A A and
More informationBiostatistics Department Technical Report
Biostatistics Departent Technical Report BST006-00 Estiation of Prevalence by Pool Screening With Equal Sized Pools and a egative Binoial Sapling Model Charles R. Katholi, Ph.D. Eeritus Professor Departent
More informationLARGE DEVIATIONS AND RARE EVENT SIMULATION FOR PORTFOLIO CREDIT RISK
LARGE DEVIATIONS AND RARE EVENT SIMULATION FOR PORTFOLIO CREDIT RISK by Hasitha de Silva A Dissertation Subitted to the Graduate Faculty of George Mason University In Partial fulfillent of The Requireents
More informationInteractive Markov Models of Evolutionary Algorithms
Cleveland State University EngagedScholarship@CSU Electrical Engineering & Coputer Science Faculty Publications Electrical Engineering & Coputer Science Departent 2015 Interactive Markov Models of Evolutionary
More informationFairness via priority scheduling
Fairness via priority scheduling Veeraruna Kavitha, N Heachandra and Debayan Das IEOR, IIT Bobay, Mubai, 400076, India vavitha,nh,debayan}@iitbacin Abstract In the context of ulti-agent resource allocation
More informationKernel-Based Nonparametric Anomaly Detection
Kernel-Based Nonparaetric Anoaly Detection Shaofeng Zou Dept of EECS Syracuse University Eail: szou@syr.edu Yingbin Liang Dept of EECS Syracuse University Eail: yliang6@syr.edu H. Vincent Poor Dept of
More informationCan the Threshold Performance of Maximum Likelihood DOA Estimation be Improved by Tools from Random Matrix Theory?
Advances in Signal Processing 2(): 8-28, 204 DOI: 0.389/asp.204.02003 http://www.hrpub.org Can the Threshold Perforance of axiu Likelihood DOA Estiation be Iproved by Tools fro Rando atrix Theory? Yuri
More informationUnderstanding Machine Learning Solution Manual
Understanding Machine Learning Solution Manual Written by Alon Gonen Edited by Dana Rubinstein Noveber 17, 2014 2 Gentle Start 1. Given S = ((x i, y i )), define the ultivariate polynoial p S (x) = i []:y
More informationQuantum algorithms (CO 781, Winter 2008) Prof. Andrew Childs, University of Waterloo LECTURE 15: Unstructured search and spatial search
Quantu algoriths (CO 781, Winter 2008) Prof Andrew Childs, University of Waterloo LECTURE 15: Unstructured search and spatial search ow we begin to discuss applications of quantu walks to search algoriths
More informationAntenna Saturation Effects on MIMO Capacity
Antenna Saturation Effects on MIMO Capacity T S Pollock, T D Abhayapala, and R A Kennedy National ICT Australia Research School of Inforation Sciences and Engineering The Australian National University,
More informationFundamental Limits of Database Alignment
Fundaental Liits of Database Alignent Daniel Cullina Dept of Electrical Engineering Princeton University dcullina@princetonedu Prateek Mittal Dept of Electrical Engineering Princeton University pittal@princetonedu
More informationExperimental Design For Model Discrimination And Precise Parameter Estimation In WDS Analysis
City University of New York (CUNY) CUNY Acadeic Works International Conference on Hydroinforatics 8-1-2014 Experiental Design For Model Discriination And Precise Paraeter Estiation In WDS Analysis Giovanna
More informationChapter 6 1-D Continuous Groups
Chapter 6 1-D Continuous Groups Continuous groups consist of group eleents labelled by one or ore continuous variables, say a 1, a 2,, a r, where each variable has a well- defined range. This chapter explores:
More informationThe Distribution of the Covariance Matrix for a Subset of Elliptical Distributions with Extension to Two Kurtosis Parameters
journal of ultivariate analysis 58, 96106 (1996) article no. 0041 The Distribution of the Covariance Matrix for a Subset of Elliptical Distributions with Extension to Two Kurtosis Paraeters H. S. Steyn
More informationarxiv: v5 [cs.it] 16 Mar 2012
ONE-BIT COMPRESSED SENSING BY LINEAR PROGRAMMING YANIV PLAN AND ROMAN VERSHYNIN arxiv:09.499v5 [cs.it] 6 Mar 0 Abstract. We give the first coputationally tractable and alost optial solution to the proble
More informationBayes Decision Rule and Naïve Bayes Classifier
Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.
More informationANALYSIS OF HALL-EFFECT THRUSTERS AND ION ENGINES FOR EARTH-TO-MOON TRANSFER
IEPC 003-0034 ANALYSIS OF HALL-EFFECT THRUSTERS AND ION ENGINES FOR EARTH-TO-MOON TRANSFER A. Bober, M. Guelan Asher Space Research Institute, Technion-Israel Institute of Technology, 3000 Haifa, Israel
More informationTail estimates for norms of sums of log-concave random vectors
Tail estiates for nors of sus of log-concave rando vectors Rados law Adaczak Rafa l Lata la Alexander E. Litvak Alain Pajor Nicole Toczak-Jaegerann Abstract We establish new tail estiates for order statistics
More informationAn Improved Particle Filter with Applications in Ballistic Target Tracking
Sensors & ransducers Vol. 72 Issue 6 June 204 pp. 96-20 Sensors & ransducers 204 by IFSA Publishing S. L. http://www.sensorsportal.co An Iproved Particle Filter with Applications in Ballistic arget racing
More informationPULSE-TRAIN BASED TIME-DELAY ESTIMATION IMPROVES RESILIENCY TO NOISE
PULSE-TRAIN BASED TIME-DELAY ESTIMATION IMPROVES RESILIENCY TO NOISE 1 Nicola Neretti, 1 Nathan Intrator and 1,2 Leon N Cooper 1 Institute for Brain and Neural Systes, Brown University, Providence RI 02912.
More information