Entropy power inequality for a family of discrete random variables
2011 IEEE International Symposium on Information Theory Proceedings

Entropy power inequality for a family of discrete random variables

Naresh Sharma, Smarajit Das and Siddharth Muthukrishnan
School of Technology and Computer Science
Tata Institute of Fundamental Research, Mumbai, India
{nsharma,smarajit}@tifr.res.in, smkrish@tcs.tifr.res.in

Abstract—It is known that the Entropy Power Inequality (EPI) always holds if the random variables have density. Not much work has been done to identify discrete distributions for which the inequality holds with the differential entropy replaced by the discrete entropy. Harremoës and Vignat showed that it holds for the pair (B(m,p), B(n,p)), m, n ∈ ℕ (where B(n,p) is a Binomial distribution with n trials, each with success probability p), for p = 0.5. In this paper, we considerably expand the set of Binomial distributions for which the inequality holds and, in particular, identify n₀(p) such that for all m, n ≥ n₀(p), the EPI holds for (B(m,p), B(n,p)). We further show that the EPI holds for the discrete random variables that can be expressed as the sum of n independent and identically distributed (IID) discrete random variables for large n.

Index Terms—Entropy power inequality, Taylor's theorem, asymptotic series, binomial distribution.

I. INTRODUCTION

The Entropy Power Inequality

    e^{2h(X+Y)} ≥ e^{2h(X)} + e^{2h(Y)}    (1)

holds for independent random variables X and Y with densities, where h(·) is the differential entropy. It was first stated by Shannon in Ref. [1], and the proof was given by Stam and Blachman [2]. See also Refs. [3], [4], [5], [6], [7], [8], [9]. This inequality is, in general, not true for discrete distributions where the differential entropy is replaced by the discrete entropy. For some special cases (binary random variables with modulo-2 addition), results have been provided by Shamai and Wyner in Ref. [10].
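Before specializing to discrete laws, recall that (1) is tight for independent Gaussians: if X ~ N(0, σ²) then h(X) = (1/2) log(2πeσ²), so e^{2h(X)} = 2πeσ², and variances add under convolution. A quick numerical illustration of this standard fact (not taken from the paper):

```python
from math import pi, e, log, exp

def entropy_power(var):
    # e^{2 h(X)} for X ~ N(0, var), where h(X) = (1/2) log(2 pi e var);
    # the entropy power is therefore exactly 2 * pi * e * var.
    return exp(2 * (0.5 * log(2 * pi * e * var)))
```

For independent Gaussian summands the two sides of (1) coincide up to floating-point error, since both equal 2πe(σ₁² + σ₂²).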
More recently, Harremoës and Vignat have shown that this inequality holds if X and Y are B(n, 1/2) and B(m, 1/2) respectively for all m, n [11]. Significantly, the convolution operation to get the distribution of X + Y is performed over the usual addition over the reals and not over finite fields. Recently, another approach has been expounded by Harremoës et al. [12] and by Johnson and Yu [13], wherein they interpret Rényi's thinning operation on a discrete random variable as a discrete analog of the scaling operation for continuous random variables. They provide inequalities for the convolutions of thinned discrete random variables that can be interpreted as the discrete analogs of the ones for the continuous case.

In this paper, we take a re-look at the result by Harremoës and Vignat [11] for the Binomial family and extend it to all p ∈ (0, 1). We show that there always exists an n₀(p), a function of p, such that for all m, n ≥ n₀(p),

    e^{2H[B(m+n,p)]} ≥ e^{2H[B(m,p)]} + e^{2H[B(n,p)]},    (2)

where H(·) is the discrete entropy. The result in Ref. [11] is a special case of our result, since we obtain n₀(0.5) = 7, and it can be checked numerically by using a sufficient condition that the inequality holds for m, n ≤ 6. We then extend our results to the family of discrete random variables that can be written as the sum of n IID random variables and show that for large n, the EPI holds. We also look at the semi-asymptotic case for the distributions B(m, p) with m small and B(n, p) with n large. However, when n is large, there may exist some m such that the EPI does not hold.

The proofs of all the lemmas and theorems except those of Theorems 2 and 3 are omitted due to lack of space, but they are available in Ref. [14].

II. EPI FOR THE BINOMIAL DISTRIBUTION

Our aim is to have an estimate of the threshold n₀(p) such that

    e^{2H[B(m+n,p)]} ≥ e^{2H[B(m,p)]} + e^{2H[B(n,p)]}    (3)
holds for all m, n ≥ n₀(p). It is observed that n₀(p) depends on the skewness of the associated Bernoulli distribution. The skewness of a probability distribution is defined as κ₃/κ₂^{3/2}, where κ₂ and κ₃ are respectively the second and third cumulants; for the Bernoulli distribution B(1, p), it turns out to be (1 − 2p)/√(p(1−p)). Let

    ω(p) = (2p − 1)²/(p(1 − p)).    (4)

We find an expression for n₀(p) that depends on ω(p). The well-known Taylor's theorem will be useful for this purpose (see, for example, p. 110 in Ref. [15]) and is stated as follows. Suppose f is a real function on [a, b], n ∈ ℕ, the (n−1)th derivative of f, denoted by f^{(n−1)}, is continuous on [a, b], and f^{(n)}(t) exists for all t ∈ (a, b). Let α, β be distinct points of [a, b]. Then there exists a point y between α and β such that

    f(β) = f(α) + Σ_{k=1}^{n−1} [f^{(k)}(α)/k!] (β − α)^k + [f^{(n)}(y)/n!] (β − α)^n.    (5)

For 0 ≤ p ≤ 1, let H(p) denote the discrete entropy of a Bernoulli distribution with probability of success p, that is, H(p) ≜ −p log(p) − (1−p) log(1−p). We shall use the natural logarithm throughout this paper. Note that we earlier defined H(·) to be the discrete entropy of a discrete random variable; the definition to be used will be amply clear from the context in what follows. Let

    Ĥ(x) ≜ H(p) − H(x),  x ∈ (0, 1),    (6)
    F^{(k)}(x) ≜ Ĥ^{(k)}(x)/k!.    (7)

Note that Ĥ(x) satisfies the assumptions of Taylor's theorem in x ∈ (0, 1). Therefore, we can write

    Ĥ(x) = Ĥ(p) + Σ_{k=1}^{n−1} F^{(k)}(p)(x − p)^k + F^{(n)}(x̄)(x − p)^n,    (8)

for some x̄ lying between x and p. Note that Ĥ(p) = 0. For k even, F^{(k)}(x) ≥ 0 for all x ∈ (0, 1), and hence,

    Ĥ(x) ≥ Σ_{k=1}^{2l+1} F^{(k)}(p)(x − p)^k    (9)

for all x ∈ (0, 1) and any non-negative integer l. The following useful identity will be employed at times:

    log(2) − H(p) = Σ_{ν=1}^{∞} [2^{2ν}/(2ν(2ν − 1))] (p − 1/2)^{2ν}.    (10)

Let P ≜ {p_i} and Q ≜ {q_i} be two probability measures over a finite alphabet A. Let C^{(p)}(P, Q) and Δ_ν^{(p)}(P, Q) be measures of discrimination defined as

    C^{(p)}(P, Q) ≜ p D(P‖M) + q D(Q‖M),    (11)
    Δ_ν^{(p)}(P, Q) ≜ Σ_{i∈A} (p p_i − q q_i)^{2ν} / (p p_i + q q_i)^{2ν−1},    (12)

where M = pP + qQ, q = 1 − p, and D(·‖·) is the Kullback–Leibler divergence.
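Identity (10) can be verified numerically; the sketch below (an illustration, not from the paper) compares a truncated series against log(2) − H(p):

```python
from math import log

def bernoulli_entropy(p):
    # H(p) = -p log p - (1 - p) log(1 - p), natural logarithm.
    return -p * log(p) - (1 - p) * log(1 - p)

def identity_rhs(p, terms=200):
    # Partial sum of sum_{nu >= 1} 2^{2 nu} (p - 1/2)^{2 nu} / (2 nu (2 nu - 1)),
    # which converges since |2 (p - 1/2)| < 1 for p in (0, 1).
    return sum((2 * (p - 0.5)) ** (2 * nu) / (2 * nu * (2 * nu - 1))
               for nu in range(1, terms + 1))
```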
These quantities are the generalized capacitory discrimination and the triangular discrimination of order ν respectively, which were introduced by Topsøe [16]. The following theorem relates C^{(p)}(P, Q) with Δ_ν^{(p)}(P, Q) and will be used later to derive an expression for n₀(p). It generalizes Theorem 1 in Ref. [16].

Theorem 1. Let P and Q be two distributions over the alphabet A and 0 < p < 1. Then

    C^{(p)}(P, Q) = Σ_{ν=1}^{∞} Δ_ν^{(p)}(P, Q)/(2ν(2ν − 1)) − [log(2) − H(p)].

Let X^{(n)} be a discrete random variable that can be written as X^{(n)} = Z₁ + Z₂ + ⋯ + Z_n, where the Z_i's are IID random variables. We note that when X^{(n)} is defined as above, we have X^{(n)} + X^{(m)} = X^{(n+m)}. Let Y_n ≜ e^{2H(X^{(n)})}. We first use a lemma due to Harremoës and Vignat [11].

Lemma 1 (Harremoës and Vignat [11]). If Y_n/n is increasing, then Y_n is super-additive, i.e., Y_{m+n} ≥ Y_m + Y_n.

It is not difficult to show that this is a sufficient condition for the EPI to hold [11]. By the above lemma, the inequality

    H(X^{(n+1)}) − H(X^{(n)}) ≥ (1/2) log((n + 1)/n)    (13)

is sufficient for the EPI to hold. Let X^{(n)} = B(n, p). We have

    P_{X^{(n+1)}}(k + 1) = p P_{X^{(n)}}(k) + q P_{X^{(n)}}(k + 1).    (14)
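The recursion (14) and the sufficient condition (13) of Lemma 1 are easy to check numerically for small n; a minimal sketch (exact PMFs via math.comb; an illustration, not part of the paper):

```python
from math import comb, exp, log

def binom_pmf(n, p):
    # PMF of B(n, p).
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def entropy(pmf):
    # Discrete entropy in nats.
    return -sum(q * log(q) for q in pmf if q > 0)

def recursion_error(n, p):
    # Largest deviation in (14):
    # P_{B(n+1,p)}(k+1) = p P_{B(n,p)}(k) + (1-p) P_{B(n,p)}(k+1).
    cur, nxt = binom_pmf(n, p), binom_pmf(n + 1, p)
    return max(abs(nxt[k + 1] - (p * cur[k] + (1 - p) * cur[k + 1]))
               for k in range(n))

def ratio_increasing(p, n_max):
    # Sufficient condition of Lemma 1: Y_n / n nondecreasing, Y_n = e^{2H[B(n,p)]}.
    # A small tolerance absorbs floating-point noise (e.g. exact equality at p = 1/2).
    r = [exp(2 * entropy(binom_pmf(n, p))) / n for n in range(1, n_max + 1)]
    return all(a <= b + 1e-9 for a, b in zip(r, r[1:]))
```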
[Fig. 1. Plot of e^{2H[B(3,p)]} − e^{2H[B(2,p)]} − e^{2H[B(1,p)]} versus p.]

Define a random variable X^{(n)} + 1 by P_{X^{(n)}+1}(k + 1) = P_{X^{(n)}}(k) for all k ∈ {0, 1, …, n}. Hence, using H(X^{(n)} + 1) = H(X^{(n)}), P_{X^{(n+1)}} = p P_{X^{(n)}+1} + q P_{X^{(n)}} and (11), we generalize Eq. (3.7) in Ref. [11] as

    H(X^{(n+1)}) = H(X^{(n)}) + C^{(p)}(P_{X^{(n)}+1}, P_{X^{(n)}}).    (15)

We now derive a lower bound for C^{(p)}(P_{X^{(n)}+1}, P_{X^{(n)}}).

Lemma 2. For l ∈ ℕ,

    C^{(p)}(P_{X^{(n)}}, P_{X^{(n)}+1}) ≥ Σ_{k=2}^{2l+1} F^{(k)}(p) (n + 1)^{−k} μ_k^{(n+1)},

where μ_k^{(n)} is the k-th central moment of B(n, p), i.e.,

    μ_k^{(n)} = Σ_{i=0}^{n} (i − np)^k P_{X^{(n)}}(i).    (16)

We note that there exist m, n for p ≠ 0.5 for which the EPI does not hold, that is, e^{2H[B(m+n,p)]} < e^{2H[B(m,p)]} + e^{2H[B(n,p)]}. In fact, one can easily see that e^{2H[B(2,p)]} ≤ e^{2H[B(1,p)]} + e^{2H[B(1,p)]} for all p, with equality if and only if p = 0.5. For the case m = 1 and n = 2, it is clear from Fig. 1 that the EPI holds when p is close to 0.5, while the EPI does not hold if p is close to 0 or 1. This leads us to the question: for a given p, what should m, n be such that the EPI holds? The main theorem of this section answers this question.

Theorem 2.

    H[B(n + 1, p)] − H[B(n, p)] ≥ (1/2) log((n + 1)/n)  for all n ≥ n₀(p).

Several candidates for n₀(p) are possible, such as n₀(p) = 4.44 ω(p) + 7; other choices, better for small ω(p), are given in the Appendix and Ref. [14].

Proof: See the Appendix.

Using Lemma 2, we can obtain a non-asymptotic lower bound on the entropy of the binomial distribution, unlike the asymptotic expansion of H[B(n, p)] given in Ref. [17]. For details, see Ref. [14].

III. EPI FOR THE SUM OF IID RANDOM VARIABLES

We showed in the previous section that the EPI holds for the pair (B(n, p), B(m, p)) for all m, n ≥ n₀(p). The question naturally arises whether the EPI holds for all discrete random variables that can be expressed as a sum of IID random variables.
Let X^{(n)} be a discrete random variable such that

    X^{(n)} ≜ X₁ + X₂ + ⋯ + X_n,    (17)

where the X_i's are IID random variables and σ² is the variance of X₁. The series Σ_{l=1}^{∞} a_l φ_l(n) is said to be an asymptotic expansion for f(n) as n → ∞ if

    f(n) = Σ_{l=1}^{N} a_l φ_l(n) + o[φ_N(n)]  for all N,    (18)

and is written as f(n) ∼ Σ_l a_l φ_l(n), where γ(n) = o[φ(n)] means lim_{n→∞} γ(n)/φ(n) = 0. We shall use the asymptotic expansion due to Knessl [17].
Lemma 3 (Knessl [17]). For a random variable X^{(n)}, as defined above, having finite moments, we have, as n → ∞,

    g(n) ≜ H(X^{(n)}) − (1/2) log(2πenσ²) ∼ −κ₃²/(12σ⁶n) + Σ_{l=1}^{∞} β_l/n^{l+1},    (19)

where κ_j is the jth cumulant of X₁. If κ₃ = κ₄ = ⋯ = κ_N = 0 but κ_{N+1} ≠ 0, then

    g(n) ∼ −κ_{N+1}²/(2(N + 1)! σ^{2N+2} n^{N−1}) + Σ_{l=N−1}^{∞} β_l/n^{l+1}.

Note that the leading term in the asymptotic expansion is always negative. We also note, using Lemma 3, that as n → ∞,

    H(X^{(n)}) < (1/2) log(2πenσ²).    (20)

To see this, we invoke the definition of the asymptotic series to get

    g(n) = −κ_{N+1}²/(2(N + 1)! σ^{2N+2} n^{N−1}) + β_{N−1}/n^N + o(1/n^N).

From the definition of the little-oh notation, we know that given any ε > 0, there exists an L(ε) > 0 such that for all n > L(ε),

    g(n) ≤ −κ_{N+1}²/(2(N + 1)! σ^{2N+2} n^{N−1}) + (|β_{N−1}| + ε)/n^N.

Choosing n large enough, we get the desired result. We consider the case of the pair (X^{(n)}, X^{(m)}) when both m, n are large, and have the following result.

Theorem 3. There exists an n₀ ∈ ℕ such that

    e^{2H(X^{(m)} + X^{(n)})} ≥ e^{2H(X^{(m)})} + e^{2H(X^{(n)})}  for all m, n ≥ n₀.    (21)

Proof: We shall prove the sufficient condition for the EPI to hold (as per Lemma 1) and show that

    H(X^{(n+1)}) − H(X^{(n)}) ≥ (1/2) log((n + 1)/n)    (22)

for all n ≥ n₀, for some n₀ ∈ ℕ. Note that the left-hand side of (22) equals (1/2) log((n + 1)/n) + g(n + 1) − g(n), so it suffices to show g(n + 1) − g(n) ≥ 0 for large n. Let us take the first three terms of the above asymptotic series as

    g(n) ∼ −C₁/n^{γ₁} + C₂/n^{γ₂} + C₃/n^{γ₃},    (23)

where 0 < γ₁ < γ₂ < γ₃ and C₁ is some non-zero positive constant; hence,

    g(n) + C₁/n^{γ₁} − C₂/n^{γ₂} − C₃/n^{γ₃} = o(1/n^{γ₃}),    (24)

and given any ε > 0, there exists an L(ε) > 0 such that for all n > L(ε),

    |g(n) + C₁/n^{γ₁} − C₂/n^{γ₂} − C₃/n^{γ₃}| ≤ ε/n^{γ₃}.    (25)

From the above inequality, we have, by using the lower and upper bounds respectively for g(n + 1) and g(n), that

    g(n + 1) − g(n) ≥ C₁ [1/n^{γ₁} − 1/(n + 1)^{γ₁}] + C₂ [1/(n + 1)^{γ₂} − 1/n^{γ₂}] + [(C₃ − ε)/(n + 1)^{γ₃} − (C₃ + ε)/n^{γ₃}].

From the above expression, we can clearly see that the first term is strictly positive and is Θ(1/n^{γ₁+1}). The second and third terms (their signs are irrelevant) are of the order Θ(1/n^{γ₂+1}) and Θ(1/n^{γ₃}) respectively. Here Θ(·) means both an asymptotic lower and an asymptotic upper bound.
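Lemma 3's prediction that g(n) < 0, and the conclusion of Theorem 3, can be probed for a concrete IID family, e.g. X^{(n)} = B(n, p) (Bernoulli summands, σ² = p(1 − p)); a rough numerical sketch, not taken from the paper:

```python
from math import comb, exp, log, pi, e

def binom_entropy(n, p):
    # H[B(n, p)] in nats.
    pmf = (comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
    return -sum(q * log(q) for q in pmf if q > 0)

def g(n, p):
    # g(n) = H(X^{(n)}) - (1/2) log(2 pi e n sigma^2) for Bernoulli(p) summands;
    # by (19)-(20) this should be negative and shrink in magnitude as n grows.
    return binom_entropy(n, p) - 0.5 * log(2 * pi * e * n * p * (1 - p))

def epi_holds(m, n, p):
    # Direct check of e^{2H(X^{(m)}+X^{(n)})} >= e^{2H(X^{(m)})} + e^{2H(X^{(n)})}.
    H = binom_entropy
    return exp(2 * H(m + n, p)) >= exp(2 * H(m, p)) + exp(2 * H(n, p))
```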
It is clear that there exists some positive integer n₀ such that for all n ≥ n₀, the first (positive) term dominates, the other two terms being negligible in comparison, and hence g(n + 1) − g(n) ≥ 0.

It must be noted that if one considers the pair (X^{(n)}, X^{(m)}) where n → ∞ and m is fixed, then the EPI need not hold. As can be seen in our paper [14], it can be shown that

    e^{2H(X^{(m)} + X^{(n)})} ≥ e^{2H(X^{(m)})} + e^{2H(X^{(n)})}

as n → ∞ if H(X^{(m)}) < (1/2) log[2πemσ²].

IV. CONCLUSIONS

We have expanded the set of pairs of Binomial distributions for which the EPI holds. We identified a threshold, a function of the probability of success, beyond which the EPI holds. We further showed that the EPI holds for discrete random variables that can be written as a sum of IID random variables. We also obtain an improvement of a bound given by Artstein et al. [6] (see Ref. [14]).

It would be interesting to know if C^{(p)}(P_{X^{(n)}+1}, P_{X^{(n)}}) for X^{(n)} = B(n, p) is a concave function of p. It would also be of interest to know, for a given p ∈ (0, 0.5), whether H[B(n + 1, p)] − H[B(n, p)] − 0.5 log(1 + 1/n) has a single zero crossing as a function of n as n increases from 1 to ∞.
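The zero-crossing question can be explored numerically; the sketch below evaluates H[B(n + 1, p)] − H[B(n, p)] − 0.5 log(1 + 1/n) for a fixed p (illustrative only; it does not settle the question):

```python
from math import comb, log

def binom_entropy(n, p):
    # H[B(n, p)] in nats.
    pmf = (comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
    return -sum(q * log(q) for q in pmf if q > 0)

def delta(n, p):
    # H[B(n+1,p)] - H[B(n,p)] - 0.5 log(1 + 1/n): negative for small n when p
    # is far from 0.5, and nonnegative for n >= n_0(p) by Theorem 2.
    return binom_entropy(n + 1, p) - binom_entropy(n, p) - 0.5 * log(1 + 1 / n)

# Sign pattern over a range of n for one illustrative p.
signs = [delta(n, 0.2) > 0 for n in range(1, 101)]
```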
REFERENCES

[1] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, pp. 379–423 and 623–656, July and Oct. 1948.
[2] N. M. Blachman, "The convolution inequality for entropy powers," IEEE Trans. Inf. Theory, vol. 11, pp. 267–271, Apr. 1965.
[3] M. H. M. Costa, "A new entropy power inequality," IEEE Trans. Inf. Theory, vol. 31, pp. 751–760, Nov. 1985.
[4] A. Dembo, T. M. Cover, and J. A. Thomas, "Information theoretic inequalities," IEEE Trans. Inf. Theory, vol. 37, pp. 1501–1518, Nov. 1991.
[5] O. T. Johnson, "Log-concavity and the maximum entropy property of the Poisson distribution," Stoch. Proc. Appl., vol. 117, pp. 791–802, June 2007.
[6] S. Artstein, K. M. Ball, F. Barthe, and A. Naor, "Solution of Shannon's problem on the monotonicity of entropy," J. Amer. Math. Soc., vol. 17, pp. 975–982, May 2004.
[7] A. M. Tulino and S. Verdú, "Monotonic decrease of the non-Gaussianness of the sum of independent random variables: A simple proof," IEEE Trans. Inf. Theory, vol. 52, pp. 4295–4297, Sep. 2006.
[8] S. Verdú and D. Guo, "A simple proof of the entropy-power inequality," IEEE Trans. Inf. Theory, vol. 52, pp. 2165–2166, May 2006.
[9] M. Madiman and A. Barron, "Generalized entropy power inequalities and monotonicity properties of information," IEEE Trans. Inf. Theory, vol. 53, pp. 2317–2329, July 2007.
[10] S. Shamai (Shitz) and A. D. Wyner, "A binary analog to the entropy-power inequality," IEEE Trans. Inf. Theory, vol. 36, pp. 1428–1430, Nov. 1990.
[11] P. Harremoës and C. Vignat, "An entropy power inequality for the binomial family," J. Inequal. Pure Appl. Math., vol. 4, no. 5, Oct. 2003.
[12] P. Harremoës, O. Johnson, and I. Kontoyiannis, "Thinning, entropy, and the law of thin numbers," IEEE Trans. Inf. Theory, vol. 56, pp. 4228–4244, Sep. 2010.
[13] O. Johnson and Y. Yu, "Monotonicity, thinning, and discrete versions of the entropy power inequality," IEEE Trans. Inf. Theory, vol. 56, pp. 5387–5395, Nov. 2010.
[14] N. Sharma, S. Das, and S. Muthukrishnan, "Entropy power inequality for a family of discrete random variables," http://arxiv.org/abs/1012.0412, Dec. 2010.
[15] W. Rudin, Principles of Mathematical Analysis, 3rd ed. McGraw-Hill, 1976.
[16] F. Topsøe, "Some inequalities for information divergence and related measures of discrimination," IEEE Trans. Inf. Theory, vol. 46, pp. 1602–1609, July 2000.
[17] C. Knessl, "Integral representations and asymptotic expansions for Shannon and Renyi entropies," Appl. Math. Lett., vol. 11, no. 2, pp. 69–74, Mar. 1998.

APPENDIX
PROOF OF THEOREM 2

We prove that for all n ≥ n₀(p),

    H[B(n + 1, p)] − H[B(n, p)] ≥ (1/2) log((n + 1)/n).    (26)

Using Lemma 2, we have

    H[B(n + 1, p)] − H[B(n, p)] ≥ Σ_{k=2}^{7} F^{(k)}(p) (n + 1)^{−k} μ_k^{(n+1)}.    (27)

Let r = p − 1/2 and t = ω(p) = 16r²/(1 − 4r²), so that r² = t/[4(t + 4)]. Note that r² ∈ [0, 1/4) and t ∈ [0, ∞). We will use the first seven central moments of B(n, p); they contain only even powers of r and hence can be written as functions of t. We upper bound the right-hand side of (26) using

    log((n + 1)/n) ≤ 1/n − 1/(2n²) + 1/(3n³).    (28)

Define

    f(n, t) ≜ Σ_{k=2}^{7} F^{(k)}(p) (n + 1)^{−k} μ_k^{(n+1)} − (1/2) [1/n − 1/(2n²) + 1/(3n³)],

with r² = t/[4(t + 4)] substituted throughout so that f is a function of n and t alone. Proving (26) is then equivalent to showing that f(n, t) ≥ 0 for all n ≥ n₀(p). Define g(n, t) ≜ 420 (n + 1)⁶ n³ f(n, t); simplifying,

    g(n, t) = 35n⁷t + (70 + 35t + 35t²)n⁶ − ⋯,    (29)

a polynomial in n and t whose remaining coefficients (polynomials in t) are given explicitly in Ref. [14]. A simple but elaborate calculation shows that g(4.44t + 7 + m, t), expanded as a polynomial in m, has all coefficients positive; for instance, the m⁷ and m⁶ coefficients are 35t and 1122.8t² + 1750t + 70 respectively. Hence f(4.44t + 7 + m, t) ≥ 0 for all m ≥ 0, that is, f(n, t) ≥ 0 for all n ≥ 4.44t + 7. A more careful choice yields all coefficients positive already for n ≥ 4.438t + 7, and yet other choices of n₀(p), better for small t, are possible; further refinements appear in Ref. [14].
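Two ingredients above lend themselves to a quick numerical sanity check: the truncated-series bound (28) and the threshold n₀(p) = 4.44 ω(p) + 7 of Theorem 2 (a rough check under the stated formulas, not a substitute for the proof):

```python
from math import comb, log

def log_upper_bound(n):
    # Right-hand side of (28): log((n+1)/n) <= 1/n - 1/(2 n^2) + 1/(3 n^3).
    return 1 / n - 1 / (2 * n**2) + 1 / (3 * n**3)

def binom_entropy(n, p):
    # H[B(n, p)] in nats.
    pmf = (comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
    return -sum(q * log(q) for q in pmf if q > 0)

def theorem2_gap(n, p):
    # H[B(n+1,p)] - H[B(n,p)] - (1/2) log((n+1)/n); Theorem 2 asserts this is
    # nonnegative once n >= n_0(p).
    return binom_entropy(n + 1, p) - binom_entropy(n, p) - 0.5 * log((n + 1) / n)

def n0(p):
    # Candidate threshold n_0(p) = 4.44 * omega(p) + 7, with omega from Eq. (4).
    return 4.44 * (2 * p - 1) ** 2 / (p * (1 - p)) + 7
```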
More informationPETER HARREMOËS AND CHRISTOPHE VIGNAT
AN ENTROPY POWER INEQUALITY FOR THE BINOMIAL FAMILY PETER HARREMOËS AND CHRISTOPHE VIGNAT Abstract. I this paper, we prove that the classical Etropy Power Iequality, as derived i the cotiuous case, ca
More informationLecture 2: Discrete Probability Distributions
Lecture 2: Discrete Probability Distributions IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge February 1st, 2011 Rasmussen (CUED) Lecture
More informationHeat Flow Derivatives and Minimum Mean-Square Error in Gaussian Noise
Heat Flow Derivatives and Minimum Mean-Square Error in Gaussian Noise Michel Ledoux University of Toulouse, France Abstract We connect recent developments on Gaussian noise estimation and the Minimum Mean-Square
More informationEE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018
Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code
More informationECE598: Information-theoretic methods in high-dimensional statistics Spring 2016
ECE598: Information-theoretic methods in high-dimensional statistics Spring 06 Lecture : Mutual Information Method Lecturer: Yihong Wu Scribe: Jaeho Lee, Mar, 06 Ed. Mar 9 Quick review: Assouad s lemma
More informationFixed-Length-Parsing Universal Compression with Side Information
Fixed-ength-Parsing Universal Compression with Side Information Yeohee Im and Sergio Verdú Dept. of Electrical Eng., Princeton University, NJ 08544 Email: yeoheei,verdu@princeton.edu Abstract This paper
More informationSoft Covering with High Probability
Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information
More informationBipartite decomposition of random graphs
Bipartite decomposition of random graphs Noga Alon Abstract For a graph G = (V, E, let τ(g denote the minimum number of pairwise edge disjoint complete bipartite subgraphs of G so that each edge of G belongs
More informationMath Camp II. Calculus. Yiqing Xu. August 27, 2014 MIT
Math Camp II Calculus Yiqing Xu MIT August 27, 2014 1 Sequence and Limit 2 Derivatives 3 OLS Asymptotics 4 Integrals Sequence Definition A sequence {y n } = {y 1, y 2, y 3,..., y n } is an ordered set
More informationTowards control over fading channels
Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti
More informationLecture 7: February 6
CS271 Randomness & Computation Spring 2018 Instructor: Alistair Sinclair Lecture 7: February 6 Disclaimer: These notes have not been subjected to the usual scrutiny accorded to formal publications. They
More informationA proof of a partition conjecture of Bateman and Erdős
proof of a partition conjecture of Bateman and Erdős Jason P. Bell Department of Mathematics University of California, San Diego La Jolla C, 92093-0112. US jbell@math.ucsd.edu 1 Proposed Running Head:
More information3F1 Information Theory, Lecture 1
3F1 Information Theory, Lecture 1 Jossy Sayir Department of Engineering Michaelmas 2013, 22 November 2013 Organisation History Entropy Mutual Information 2 / 18 Course Organisation 4 lectures Course material:
More informationMinimax Redundancy for Large Alphabets by Analytic Methods
Minimax Redundancy for Large Alphabets by Analytic Methods Wojciech Szpankowski Purdue University W. Lafayette, IN 47907 March 15, 2012 NSF STC Center for Science of Information CISS, Princeton 2012 Joint
More informationSteiner s formula and large deviations theory
Steiner s formula and large deviations theory Venkat Anantharam EECS Department University of California, Berkeley May 19, 2015 Simons Conference on Networks and Stochastic Geometry Blanton Museum of Art
More informationMath Bootcamp 2012 Miscellaneous
Math Bootcamp 202 Miscellaneous Factorial, combination and permutation The factorial of a positive integer n denoted by n!, is the product of all positive integers less than or equal to n. Define 0! =.
More informationSum-Rate Capacity of Poisson MIMO Multiple-Access Channels
1 Sum-Rate Capacity of Poisson MIMO Multiple-Access Channels Ain-ul-Aisha, Lifeng Lai, Yingbin Liang and Shlomo Shamai (Shitz Abstract In this paper, we analyze the sum-rate capacity of two-user Poisson
More informationMAT 585: Johnson-Lindenstrauss, Group testing, and Compressed Sensing
MAT 585: Johnson-Lindenstrauss, Group testing, and Compressed Sensing Afonso S. Bandeira April 9, 2015 1 The Johnson-Lindenstrauss Lemma Suppose one has n points, X = {x 1,..., x n }, in R d with d very
More informationTight Bounds on Minimum Maximum Pointwise Redundancy
Tight Bounds on Minimum Maximum Pointwise Redundancy Michael B. Baer vlnks Mountain View, CA 94041-2803, USA Email:.calbear@ 1eee.org Abstract This paper presents new lower and upper bounds for the optimal
More informationExtreme Value for Discrete Random Variables Applied to Avalanche Counts
Extreme Value for Discrete Random Variables Applied to Avalanche Counts Pascal Alain Sielenou IRSTEA / LSCE / CEN (METEO FRANCE) PostDoc (MOPERA and ECANA projects) Supervised by Nicolas Eckert and Philippe
More informationCompressibility of Infinite Sequences and its Interplay with Compressed Sensing Recovery
Compressibility of Infinite Sequences and its Interplay with Compressed Sensing Recovery Jorge F. Silva and Eduardo Pavez Department of Electrical Engineering Information and Decision Systems Group Universidad
More informationInformation Theory and Hypothesis Testing
Summer School on Game Theory and Telecommunications Campione, 7-12 September, 2014 Information Theory and Hypothesis Testing Mauro Barni University of Siena September 8 Review of some basic results linking
More informationEfficient Alphabet Partitioning Algorithms for Low-complexity Entropy Coding
Efficient Alphabet Partitioning Algorithms for Low-complexity Entropy Coding Amir Said (said@ieee.org) Hewlett Packard Labs, Palo Alto, CA, USA Abstract We analyze the technique for reducing the complexity
More informationENTROPY VERSUS VARIANCE FOR SYMMETRIC LOG-CONCAVE RANDOM VARIABLES AND RELATED PROBLEMS
ENTROPY VERSUS VARIANCE FOR SYMMETRIC LOG-CONCAVE RANDOM VARIABLES AND RELATED PROBLEMS MOKSHAY MADIMAN, PIOTR NAYAR, AND TOMASZ TKOCZ Abstract We show that the uniform distribution minimises entropy among
More informationWE study the capacity of peak-power limited, single-antenna,
1158 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 3, MARCH 2010 Gaussian Fading Is the Worst Fading Tobias Koch, Member, IEEE, and Amos Lapidoth, Fellow, IEEE Abstract The capacity of peak-power
More informationH(X) = plog 1 p +(1 p)log 1 1 p. With a slight abuse of notation, we denote this quantity by H(p) and refer to it as the binary entropy function.
LECTURE 2 Information Measures 2. ENTROPY LetXbeadiscreterandomvariableonanalphabetX drawnaccordingtotheprobability mass function (pmf) p() = P(X = ), X, denoted in short as X p(). The uncertainty about
More informationOn the Low-SNR Capacity of Phase-Shift Keying with Hard-Decision Detection
On the Low-SNR Capacity of Phase-Shift Keying with Hard-Decision Detection ustafa Cenk Gursoy Department of Electrical Engineering University of Nebraska-Lincoln, Lincoln, NE 68588 Email: gursoy@engr.unl.edu
More informationStochastic Comparisons of Order Statistics from Generalized Normal Distributions
A^VÇÚO 1 33 ò 1 6 Ï 2017 c 12 Chinese Journal of Applied Probability and Statistics Dec. 2017 Vol. 33 No. 6 pp. 591-607 doi: 10.3969/j.issn.1001-4268.2017.06.004 Stochastic Comparisons of Order Statistics
More informationAverage Coset Weight Distributions of Gallager s LDPC Code Ensemble
1 Average Coset Weight Distributions of Gallager s LDPC Code Ensemble Tadashi Wadayama Abstract In this correspondence, the average coset eight distributions of Gallager s LDPC code ensemble are investigated.
More informationInformation Measure Estimation and Applications: Boosting the Effective Sample Size from n to n ln n
Information Measure Estimation and Applications: Boosting the Effective Sample Size from n to n ln n Jiantao Jiao (Stanford EE) Joint work with: Kartik Venkat Yanjun Han Tsachy Weissman Stanford EE Tsinghua
More informationBootstrap Percolation on Periodic Trees
Bootstrap Percolation on Periodic Trees Milan Bradonjić Iraj Saniee Abstract We study bootstrap percolation with the threshold parameter θ 2 and the initial probability p on infinite periodic trees that
More informationOn the Applications of the Minimum Mean p th Error to Information Theoretic Quantities
On the Applications of the Minimum Mean p th Error to Information Theoretic Quantities Alex Dytso, Ronit Bustin, Daniela Tuninetti, Natasha Devroye, H. Vincent Poor, and Shlomo Shamai (Shitz) Outline Notation
More informationarxiv: v1 [cs.it] 4 Jun 2018
State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates
More informationOn the Duality between Multiple-Access Codes and Computation Codes
On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch
More informationSTOCHASTIC GEOMETRY BIOIMAGING
CENTRE FOR STOCHASTIC GEOMETRY AND ADVANCED BIOIMAGING 2018 www.csgb.dk RESEARCH REPORT Anders Rønn-Nielsen and Eva B. Vedel Jensen Central limit theorem for mean and variogram estimators in Lévy based
More information