A new converse in rate-distortion theory
Victoria Kostina, Sergio Verdú
Dept. of Electrical Engineering, Princeton University, NJ 08544, USA

Abstract—This paper shows new finite-blocklength converse bounds applicable to lossy source coding as well as joint source-channel coding, which are tight enough not only to prove the strong converse, but to find the rate-dispersion functions in both setups. In order to state the converses, we introduce the d-tilted information, a random variable whose expectation and variance (with respect to the source) are equal to the rate-distortion and rate-dispersion functions, respectively.

Index Terms—Converse, finite blocklength regime, joint source-channel coding, lossy source coding, memoryless sources, rate-distortion theory, Shannon theory.

I. INTRODUCTION

The fundamental problem in non-asymptotic rate-distortion theory is to estimate the minimum achievable source coding rate at any given blocklength k compatible with allowable distortion d. Similarly, in joint source-channel coding, the channel blocklength n and the distortion d are fixed, and one is interested in the maximum number of source samples k per channel block. While in certain special cases these non-asymptotic fundamental limits can be calculated precisely, in general no such formulas exist. Fortunately, non-asymptotic computable upper and lower bounds to the optimum code size can be obtained. This paper deals with converse, or impossibility, bounds that the rate of all codes sustaining a given fidelity level must satisfy.

The rest of the paper is organized as follows. Section II introduces the notion of d-tilted information. Sections III and IV focus on source coding and joint source-channel coding, respectively, while Section V discusses and generalizes the results of Sections III and IV. Section VI investigates the asymptotic analysis of the new converse bounds. Section VII considers lossy compression with side information at both compressor and decompressor.
II. d-TILTED INFORMATION

Denote by

    ı_{S;Z}(s; z) = log ( dP_{Z|S=s}(z) / dP_Z(z) )    (1)

the information density of the joint distribution P_{SZ} at (s, z) ∈ M × M̂. We can define the right side of (1) for a given (P_{Z|S}, P_Z) even if there is no P_S such that the marginal of P_S P_{Z|S} is P_Z. We use the same notation ı_{S;Z} for that more general function.

[Footnote: This work was supported in part by the National Science Foundation (NSF) under Grant CCF-0665 and by the Center for Science of Information (CSoI), an NSF Science and Technology Center, under Grant CCF. The first author was supported in part by the Natural Sciences and Engineering Research Council of Canada.]

Further, for a discrete random variable S, the information in outcome s is denoted by

    ı_S(s) = log ( 1 / P_S(s) )    (2)

Under appropriate conditions, the number of bits that it takes to represent s, divided by ı_S(s), converges to 1 as these quantities go to infinity. Note that if S is discrete, then ı_{S;S}(s; s) = ı_S(s). For a given P_S and a distortion measure d: M × M̂ → [0, +∞], denote

    R_S(d) = inf_{P_{Z|S}: E[d(S,Z)] ≤ d} I(S; Z)    (3)

We impose the following basic restrictions on the source and the distortion measure:

(a) R_S(d) is finite for some d, i.e. d_min < ∞, where

    d_min = inf {d: R_S(d) < ∞}    (4)

(b) The infimum in (3) is achieved by a unique P_{Z*|S}, and the distortion measure is finite-valued.

The counterpart of (2) in lossy data compression, which in a certain (asymptotic) sense corresponds to the number of bits one needs to spend to encode s within distortion d, is the following.

Definition 1 (d-tilted information). For d > d_min, the d-tilted information in s is defined as

    j_S(s, d) = log ( 1 / E[exp{λ* d − λ* d(s, Z*)}] )    (5)

where the expectation is with respect to the unconditional distribution of Z*, and

    λ* = −R′_S(d)    (6)

A general proof of the following properties can be found using [11, Lemma 1.4].

Property 1. For P_{Z*}-almost every z,

    j_S(s, d) = ı_{S;Z*}(s; z) + λ* d(s, z) − λ* d    (7)

hence the name we adopted in Definition 1.

Property 2.

    R_S(d) = E[j_S(S, d)]    (8)

Property 3.
For all z ∈ M̂,

    E[exp{j_S(S, d) + λ* d − λ* d(S, z)}] ≤ 1    (9)

[Footnote: Restriction (b) is imposed for clarity of presentation. We will show in Section V that it can be dispensed with.]
with equality in (9) for P_{Z*}-almost every z.

Remark 1. Using Property 1, it is easy to show that restriction (b) guarantees differentiability of R_S(d), thus (6) is well defined. Indeed, assume by contradiction that R_S(d) has two tangents at point d, with slopes λ*₁ and λ*₂. Writing j_{S,1}(s, d) and j_{S,2}(s, d) for the corresponding d-tilted informations in (5), we invoke (7) to write

    ı_{S;Z*}(s; z) = j_{S,1}(s, d) + λ*₁ d(s, z) − λ*₁ d    (10)
                   = j_{S,2}(s, d) + λ*₂ d(s, z) − λ*₂ d    (11)

which implies that d(s, z) does not depend on z. But that means, via (7), that ı_{S;Z*}(s; z) does not depend on z, which leads to P_{Z*|S} = P_{Z*}, or R_S(d) = 0. Thus R_S(d) can be non-differentiable only at the point where it vanishes.

Remark 2. While Definition 1 does not cover the case d = d_min, for discrete random variables with d(s, z) = 1{s ≠ z} it is natural to define the 0-tilted information as

    j_S(s, 0) = ı_S(s)    (12)

Example 1. Consider the stationary binary memoryless source (BMS) with P_S(1) = p = 1 − P_S(0) and bit error rate distortion, d(s^k, z^k) = (1/k) Σ_{i=1}^k 1{s_i ≠ z_i}. It is well known that P_{S|Z*}(1|0) = P_{S|Z*}(0|1) = d, and a simple calculation using (7) and λ* = k log((1−d)/d) yields, for 0 ≤ d < p,

    j_{S^k}(s^k, d) = Σ_{i=1}^k ı_S(s_i) − k h(d)    (13)
                    = w log(1/p) + (k − w) log(1/(1−p)) − k h(d)    (14)

where w is the number of ones in s^k, h(·) is the binary entropy function, and j_{S^k}(s^k, d) = 0 if d ≥ p.

Example 2. For the stationary Gaussian memoryless source (GMS) with variance σ² and mean-square error distortion,

    j_{S^k}(s^k, d) = (k/2) log(σ²/d) + ( ‖s^k‖²/σ² − k ) (log e)/2    (15)

if 0 < d < σ², and 0 if d ≥ σ².

The distortion d-ball around s is denoted by

    B_d(s) = {z ∈ M̂: d(s, z) ≤ d}    (16)

The d-tilted information is closely related to the (unconditional) probability that Z* falls within distortion d from S. Indeed, since λ* > 0, for an arbitrary P_Z we have, by Markov's inequality,

    P_Z(B_d(s)) = P[d(s, Z) ≤ d]    (17)
                ≤ E[exp{λ* d − λ* d(s, Z)}]    (18)

where the probability measure is generated by the unconditional distribution of Z.
Thus

    log ( 1 / P_Z(B_d(s)) ) ≥ j_S(s, d)    (19)

The lossy AEP [3] guarantees that under certain regularity conditions equality in (19) can be closely approached. We denote the Euclidean norm by ‖·‖, i.e. ‖s^n‖² = s₁² + … + s_n².

III. LOSSY SOURCE CODING

In fixed-length lossy compression, the output of a general source with alphabet M and source distribution P_S is mapped to one of M codewords from the reproduction alphabet M̂. A lossy code is a (possibly randomized) pair of mappings f: M → {1, …, M} and c: {1, …, M} → M̂. Let us introduce the following operational definition.

Definition 2. An (M, d, ɛ) code for {M, M̂, P_S, d} is a code (f, c) such that P[d(S, c(f(S))) > d] ≤ ɛ.

Note that in the conventional fixed-to-fixed (or block) setting, M and M̂ are the k-fold Cartesian products of alphabets S and Ŝ, and an (M, d, ɛ) code for such a setting is called a (k, M, d, ɛ) code. We are now ready to state our nonasymptotic converse bound for lossy compression.

Theorem 1 (Converse, source coding). Fix d > d_min. Any (M, d, ɛ) code must satisfy

    ɛ ≥ sup_{γ>0} { P[j_S(S, d) ≥ log M + γ] − exp(−γ) }    (20)

Proof: Let the compressor and the decompressor be the random transformations P_{U|S} and P_{Z|U}, respectively, where U takes values in {1, …, M}. Let Q_U be equiprobable on {1, …, M}, and let Z̄ be independent of S, with its distribution equal to the marginal of P_{Z|U} Q_U. We have, for any γ > 0,

    P[j_S(S, d) ≥ log M + γ]    (21)
      = P[j_S(S, d) ≥ log M + γ, d(S, Z) > d] + P[j_S(S, d) ≥ log M + γ, d(S, Z) ≤ d]    (22)
      ≤ ɛ + P[M ≤ exp(j_S(S, d) − γ), d(S, Z) ≤ d]    (23)
      = ɛ + E[ 1{M ≤ exp(j_S(S, d) − γ)} 1{d(S, Z) ≤ d} ]    (24)
      ≤ ɛ + (exp(−γ)/M) E[ exp(j_S(S, d)) 1{d(S, Z) ≤ d} ]    (25)
      ≤ ɛ + (exp(−γ)/M) E[ exp(j_S(S, d) + λ* d − λ* d(S, Z)) ]    (26)
      = ɛ + (exp(−γ)/M) E[ Σ_{u=1}^M P_{U|S}(u|S) E[ exp(j_S(S, d) + λ* d − λ* d(S, Z)) | U = u, S ] ]    (27)
      ≤ ɛ + exp(−γ) E[ (1/M) Σ_{u=1}^M E[ exp(j_S(S, d) + λ* d − λ* d(S, Z)) | U = u, S ] ]    (28)
      = ɛ + exp(−γ) E[ exp(j_S(S, d) + λ* d − λ* d(S, Z̄)) ]    (29)
      ≤ ɛ + exp(−γ)    (30)

where (25) is by Markov's inequality, (28) uses P_{U|S}(u|s) ≤ 1, (29) is by the definition of Z̄, and (30) uses (9) averaged with respect to P_Z̄. Since γ > 0 is arbitrary, rearranging yields (20).
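To make Examples 1 and 2 concrete before particularizing Theorem 1, the d-tilted information in (14) is directly computable. The following illustrative sketch (Python; the helper names are ours, not the paper's) evaluates (14) and checks Property 2 in the equiprobable case, where every source string has d-tilted information exactly kR(d) = k(1 − h(d)).

```python
import math

def binary_entropy(d):
    """Binary entropy function h(d) in bits, with h(0) = h(1) = 0."""
    if d in (0.0, 1.0):
        return 0.0
    return -d * math.log2(d) - (1 - d) * math.log2(1 - d)

def bms_tilted_information(s, p, d):
    """d-tilted information (in bits) of a binary tuple s under a BMS with
    P_S(1) = p and bit error rate distortion, eq. (14):
    j(s^k, d) = w log(1/p) + (k - w) log(1/(1-p)) - k h(d) for d < p, else 0."""
    k, w = len(s), sum(s)
    if d >= p:
        return 0.0
    return (w * math.log2(1 / p) + (k - w) * math.log2(1 / (1 - p))
            - k * binary_entropy(d))

# Property 2 check for the equiprobable case: every string of length k has
# d-tilted information exactly k(1 - h(d)) = k R(d).
k, d = 8, 0.11
j = bms_tilted_information((0, 1) * (k // 2), 0.5, d)
print(j, k * (1 - binary_entropy(d)))
```

For a biased source, j varies with the number of ones w in the string; its expectation over the source recovers kR(d), in agreement with Property 2.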
For lossy compression of a binary source with bit error rate distortion, a simple particularization of Theorem 1 using (14) leads to the following easily computable bound.

Corollary 1 (Converse, source coding, BMS). Consider a BMS with bias p and bit error rate distortion measure. Fix 0 ≤ d < p and 0 < ɛ < 1. For any (k, M, d, ɛ) code,

    ɛ ≥ sup_{γ>0} { P[g_k(W) ≥ log M + γ] − exp(−γ) }    (31)
    g_k(w) = w log(1/p) + (k − w) log(1/(1−p)) − k h(d)    (32)

where W is binomial with success probability p and k degrees of freedom.

Similarly, the particularization of Theorem 1 to the Gaussian case using (15) yields the following result.

Corollary 2 (Converse, source coding, GMS). Consider a GMS with variance σ² and mean-square error distortion. Fix 0 < d < σ² and 0 < ɛ < 1. For any (k, M, d, ɛ) code,

    ɛ ≥ sup_{γ>0} { P[g_k(W) ≥ log M + γ] − exp(−γ) }    (33)
    g_k(w) = (k/2) log(σ²/d) + (w − k)(log e)/2    (34)

where W ~ χ²_k (i.e. chi-square distributed with k degrees of freedom).

IV. LOSSY JOINT SOURCE-CHANNEL CODING

A lossy source-channel code is a (possibly randomized) pair of mappings f: M → X and c: Y → M̂, where X and Y are the channel input and output alphabets, respectively.

Definition 3. A (d, ɛ) code for {M, X, Y, M̂, P_S, P_{Y|X}, d} is a source-channel code with P[d(S, c(Y)) > d] ≤ ɛ, where f(S) = X.

The special case d = 0 and d(s, z) = 1{s ≠ z} corresponds to almost-lossless compression. If, in addition, P_S is equiprobable on an alphabet of cardinality M, a (0, ɛ) code in Definition 3 corresponds to an (M, ɛ) channel code (as defined in [4]). On the other hand, if the channel is an identity mapping on an alphabet of cardinality M, a (d, ɛ) code in Definition 3 corresponds to an (M, d, ɛ) lossy code (Definition 2). In the conventional block setting, X and Y are the n-fold Cartesian products of alphabets A and B, M and M̂ are the k-fold Cartesian products of alphabets S and Ŝ, and a (d, ɛ) code for such a setting is called a (k, n, d, ɛ) code.

Theorem 1 can be generalized to the joint source-channel coding setup as follows.

Theorem 2 (Converse, joint source-channel coding). Fix d > d_min.
Any (d, ɛ) code for source S and channel P_{Y|X} must satisfy

    ɛ ≥ inf_{P_{X|S}} sup_{γ>0} sup_{P_Ȳ} { P[j_S(S, d) − ı_{X;Ȳ}(X; Y) ≥ γ] − exp(−γ) }    (35)

    ɛ ≥ sup_{γ>0} sup_{P_Ȳ} { E[ inf_{x∈X} P[j_S(S, d) − ı_{X;Ȳ}(x; Y) ≥ γ | S] ] − exp(−γ) }    (36)

where in (35), S → X → Y, in (36), the probability measure is generated by P_S P_{Y|X=x}, and

    ı_{X;Ȳ}(x; y) = log ( dP_{Y|X=x}(y) / dP_Ȳ(y) )    (37)

Proof: The encoder and the decoder are the random transformations P_{X|S} and P_{Z|Y}, respectively. Fix an arbitrary P_Ȳ on Y. Let Z̄ be independent of all other random variables, with distribution dP_Z̄(z) = ∫ dP_{Z|Y=y}(z) dP_Ȳ(y). We have

    P[j_S(S, d) − ı_{X;Ȳ}(X; Y) ≥ γ]
      ≤ ɛ + P[ exp(ı_{X;Ȳ}(X; Y)) ≤ exp(j_S(S, d) − γ), d(S, Z) ≤ d ]    (38)
      ≤ ɛ + exp(−γ) E[ exp(j_S(S, d)) 1{d(S, Z̄) ≤ d} ]    (39)
      ≤ ɛ + exp(−γ) E[ exp(j_S(S, d) + λ* d − λ* d(S, Z̄)) ]    (40)
      ≤ ɛ + exp(−γ)    (41)

where (38) is obtained in the same way as going from (21) to (24), (39) is by a change of measure argument, and (41) is due to (9). Optimizing over γ > 0 and P_Ȳ to obtain the best possible bound for a given P_{X|S}, and then choosing the P_{X|S} that gives the weakest bound, the code-independent converse (35) follows. To show (36), we first weaken (35) as

    ɛ ≥ sup_{γ>0} sup_{P_Ȳ} inf_{P_{X|S}} { P[j_S(S, d) − ı_{X;Ȳ}(X; Y) ≥ γ] − exp(−γ) }    (42)

and then observe that for any P_Ȳ,

    inf_{P_{X|S}} P[j_S(S, d) − ı_{X;Ȳ}(X; Y) ≥ γ] = E[ inf_{x∈X} P[j_S(S, d) − ı_{X;Ȳ}(x; Y) ≥ γ | S] ]    (43)

V. THEOREMS 1 AND 2: DISCUSSION

A. Block coding

In the conventional block coding setting, fixing any three parameters in the quartet (k, M, d, ɛ) (in source coding) or (k, n, d, ɛ) (in source-channel coding), Theorems 1 and 2 can be used to compute converse bounds on the fourth parameter.
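As an illustration of such a computation, Corollary 1 involves only a finite sum over the binomial distribution of W, so the lower bound on ɛ for fixed (k, M, d) can be evaluated numerically. A sketch follows (Python, stdlib only; working in bits, so exp(−γ) becomes 2^{−γ}; the finite γ grid approximating the supremum is our arbitrary choice, as are the parameter values), recalling that for the BMS R(d) = h(p) − h(d):

```python
import math

def bms_converse_bound(k, log2_M, p, d, gamma_grid=None):
    """Corollary 1: any (k, M, d, eps) code for a BMS with bias p and bit
    error rate distortion satisfies
        eps >= sup_{gamma>0} { P[g_k(W) >= log M + gamma] - 2^(-gamma) },
    with W ~ Binomial(k, p) and g_k as in (32). All logarithms in bits."""
    h = lambda x: 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)
    g = [w * math.log2(1 / p) + (k - w) * math.log2(1 / (1 - p)) - k * h(d)
         for w in range(k + 1)]
    pmf = [math.comb(k, w) * p ** w * (1 - p) ** (k - w) for w in range(k + 1)]
    if gamma_grid is None:
        gamma_grid = [0.1 * i for i in range(1, 201)]   # grid over (0, 20] bits
    best = 0.0
    for gamma in gamma_grid:
        tail = sum(pm for gw, pm in zip(g, pmf) if gw >= log2_M + gamma)
        best = max(best, tail - 2.0 ** (-gamma))
    return best

# Rate 0.15 bits/sample is below R(d) = h(0.11) - h(0.05), about 0.21 bits,
# so the excess-distortion probability of any such code is bounded away from 0.
eps_lb = bms_converse_bound(200, 0.15 * 200, 0.11, 0.05)
print(eps_lb)
```

At rates above R(d) the bound may be vacuous (zero), consistent with Theorem 1 being a converse rather than an exact characterization.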
In addition, fixing the rate and d and letting the blocklength go to infinity, one can use Theorems 1 and 2 to obtain strong converses in their respective setups (Section VI-A). In the practically relevant setting in which the design parameters d and ɛ are fixed, a simple Gaussian approximation to the optimum finite-blocklength coding rate can be obtained via the analysis of the bounds in Theorems 1 and 2 using the Berry-Esseen central limit theorem (Section VI-B).

B. Almost-lossless data compression and transmission

Using (12), it is easy to check that both Theorem 1 and Theorem 2 still hold in the almost-lossless case, which corresponds to d = 0 and d(s, z) = 1{s ≠ z}. In fact, Theorem 1 gives a pleasing generalization of the almost-lossless data compression converse bound [5], [6, Lemma 1.3.1]:

    ɛ ≥ sup_{γ>0} { P[ı_S(S) ≥ log M + γ] − exp(−γ) }    (44)

Similarly, applying Theorem 2 to the almost-lossless source-channel coding setup, we conclude that any joint source-channel code with error probability not exceeding ɛ must satisfy (36) with j_S(S, d) replaced by ı_S(S), which can be regarded as a generalization of Wolfowitz's converse [7] to nonequiprobable source outputs [5].

C. Theorem 2 implies Theorem 1

Theorem 1 can be viewed as a particular case of the result in Theorem 2. Indeed, if X = Y = {1, …, M}, P_{Y|X}(m|m) = 1, and P_Ȳ(1) = … = P_Ȳ(M) = 1/M, then ı_{X;Ȳ}(x; Y) = log M a.s. regardless of x ∈ X, and (36) leads to (20).

D. Channels with input cost constraints

If the channel has input cost constraints, Theorem 2 still holds with the entire channel input alphabet X replaced by the subset of allowable channel inputs F, where F ⊆ X.

E. Theorems 1 and 2 still hold even if condition (b) fails

Even if the rate-distortion function is not achieved by any conditional output distribution, the definition of d-tilted information can be extended appropriately, so that Theorems 1 and 2 still hold. Instead of imposing restriction (b) of Section II, we require that the following condition holds.
(b′) The distortion measure is such that there exists a finite set G ⊂ M̂ such that

    E[ min_{z∈G} d(S, z) ] < ∞    (45)

Then the rate-distortion function admits the following general representation due to Csiszár [2].

Theorem 3 (Alternative representation of R(d) [2]). Under the basic restrictions (a) and (b′), for each d > d_min, it holds that

    R_S(d) = max_{α(s), λ} { E[α(S)] − λd }    (46)

where the maximization is over α(s) ≥ 0 and λ ≥ 0 satisfying the constraint

    E[exp{α(S) − λ d(S, z)}] ≤ 1    ∀z ∈ M̂    (47)

Let (α*(s), λ*) achieve the maximum in (46) for some d > d_min, and define the d-tilted information in s by

    j_S(s, d) = α*(s) − λ* d    (48)

Note that (9), the only property of the d-tilted information we used in the proofs of Theorems 1 and 2, still holds due to (47), thus both theorems remain true.

VI. ASYMPTOTIC ANALYSIS

In addition to the basic conditions (a), (b) of Section II, in this section we impose the following restrictions.

(i) The source is stationary and memoryless, P_{S^k} = P_S × … × P_S.
(ii) The distortion measure is separable, d(s^k, z^k) = (1/k) Σ_{i=1}^k d(s_i, z_i).
(iii) The distortion level satisfies d > d_min, where d_min is defined in (4).
(iv) The supremum C = sup_{P_X} I(X; Y) is attained by a unique capacity-achieving distribution P_{X*}.
(v) The channel is stationary and memoryless, P_{Y^n|X^n} = P_{Y|X} × … × P_{Y|X}, either discrete with finite input alphabet, or Gaussian with a maximal power constraint.

A. Strong converses

The strong converse for source coding on a general alphabet under a distortion constraint was previously obtained in [8]. In joint source-channel coding, if the source and the channel are both discrete or both Gaussian, strong converses can be obtained via the error exponent results in [9], [10]; however, no strong converses as general as that in [8] are known for joint source-channel coding. As we show next, Theorems 1 and 2 lead to very general strong converses in their respective setups. Due to (i) and (ii), P_{Z*^k} = P_{Z*} × … × P_{Z*}, and the d-tilted information single-letterizes; that is, for a.e.
s^k,

    j_{S^k}(s^k, d) = Σ_{i=1}^k j_S(s_i, d)    (49)

In the lossy compression setup, weakening (20) by choosing γ = kτ for some fixed τ > 0 and choosing log M = kR(d) − 2γ, we obtain

    ɛ ≥ P[ Σ_{i=1}^k j_S(S_i, d) ≥ kR(d) − kτ ] − exp(−kτ)    (50)

Using (8), we conclude that the right side of (50) tends to 1 as k increases, by the law of large numbers.

In the joint source-channel coding setup, we choose γ as above, choose P_Ȳ = P_{Y*}, where P_{Y*} is the channel output distribution generated by the capacity-achieving input distribution P_{X*}, let P_Ȳ^n = P_{Y*} × … × P_{Y*}, and, for each n,
choose k so that kR(d) = nC + 3γ. Then, (36) weakens to

    ɛ ≥ E[ inf_{x^n∈A^n} P[ Σ_{i=1}^k j_S(S_i, d) − Σ_{j=1}^n ı_{X;Y*}(x_j; Y_j) ≥ kτ | S^k ] ] − exp(−kτ)    (51)
      ≥ inf_{x^n∈A^n} P[ Σ_{j=1}^n ı_{X;Y*}(x_j; Y_j) ≤ nC + kτ ] · P[ Σ_{i=1}^k j_S(S_i, d) ≥ nC + 2kτ ] − exp(−kτ)    (52)

Recalling (8) and E[ı_{X;Y*}(x; Y) | X = x] ≤ C, with equality for P_{X*}-a.e. x, we conclude using the law of large numbers that the right side of (52) tends to 1 as k, n → ∞ for a fixed ratio k/n > C/R(d).

B. Gaussian approximation analysis

In addition to restrictions (i)-(v), we assume that the following holds.

(vi) The excess-distortion probability satisfies 0 < ɛ < 1.
(vii) The random variable j_S(S, d) has finite absolute third moment.

Theorem 4 (Gaussian approximation, source coding). Under restrictions (i)-(iii), (vi)-(vii), any (k, M, d, ɛ) code satisfies

    log M ≥ kR(d) + √(kV(d)) Q⁻¹(ɛ) − (1/2) log k + O(1)    (53)

where Q⁻¹ denotes the inverse of the complementary standard Gaussian cdf, and V(d) is the source rate-dispersion function given by

    V(d) = Var[j_S(S, d)]    (54)

Proof: Consider the case V(d) > 0, so that

    B = 6 E[ |j_S(S, d) − R(d)|³ ] / Var[j_S(S, d)]^{3/2}    (55)

is finite according to restriction (vii), and the Berry-Esseen bound [12, Ch. XVI.5] applies to Σ_{i=1}^k j_S(S_i, d). Let γ = (1/2) log k in (20), and choose

    log M = kR(d) + √(kV(d)) Q⁻¹(ɛ_k) − γ    (56)
    ɛ_k = ɛ + exp(−γ) + B/√k    (57)

so that log M can be written as the right side of (53). Substituting (49) and (56) in (20), we conclude that for any (M, d, ɛ′) code it must hold that

    ɛ′ ≥ P[ Σ_{i=1}^k j_S(S_i, d) ≥ kR(d) + √(kV(d)) Q⁻¹(ɛ_k) ] − exp(−γ)    (58)

The proof for V(d) > 0 is complete upon noting that the right side of (58) is lower bounded by ɛ by the Berry-Esseen bound, in view of (57). If V(d) = 0, it follows that j_S(S, d) = R(d) almost surely. Choosing γ = log(1/(1−ɛ)) and log M = kR(d) − γ in (20), it is obvious that ɛ′ ≥ 1 − exp(−γ) = ɛ.

Using Theorem 2, the following generalization of Theorem 4 can be shown for joint source-channel coding.

Theorem 5 (Gaussian approximation, joint source-channel coding).
Under restrictions (i)-(vii), any (k, n, d, ɛ) code must satisfy

    nC − kR(d) ≤ √(nV + kV(d)) Q⁻¹(ɛ) + O(log(n + k))    (59)

where V is the channel dispersion [4] given by

    V = Var[ı_{X*;Y*}(X*; Y*)]    (60)

and V(d) is the source dispersion given by (54).

Matching achievability results can be proven [11], [13], ensuring that the reverse inequalities in (53) and (59) also hold (up to the O(log k) term).

VII. LOSSY SOURCE CODING WITH SIDE INFORMATION

The results of Sections III and VI can be easily generalized to a setting in which both the compressor and the decompressor have access to side information Y ∈ Y which is statistically dependent on the source outcome S ∈ M. A lossy code with side information at the compressor and the decompressor is a (possibly randomized) pair of mappings f: M × Y → {1, …, M} and c: {1, …, M} × Y → M̂.

Definition 4. An (M, d, ɛ) code for {M, Y, M̂, P_{SY}, d} is a code (f, c) such that P[d(S, c(f(S, Y), Y)) > d] ≤ ɛ.

The notion of tilted information introduced in Section II is easily generalized when side information is available at the compressor and the decompressor. In parallel with (1), (2), denote

    ı_{S;Z|Y}(s; z|y) = log ( dP_{SZ|Y=y}(s, z) / d(P_{S|Y=y} × P_{Z|Y=y})(s, z) )    (61)
    ı_{S|Y}(s|y) = log ( 1 / P_{S|Y=y}(s) )    (62)

For a given P_{SY} and a distortion measure d: M × M̂ → [0, +∞], denote

    R_{S|Y}(d) = inf_{P_{Z|S,Y}: E[d(S,Z)] ≤ d} I(S; Z|Y)    (63)

Under the same basic restrictions (a) and (b) of Section II (replacing R_S(d) by R_{S|Y}(d) and P_{Z*|S} by P_{Z*|SY}), Definition 1 is generalized as follows.

Definition 5 (d-tilted information with side information). For d > d_min, the d-tilted information in s with side information y at the compressor and the decompressor is defined as

    j_{S|Y}(s, d|y) = log ( 1 / E[exp{λ* d − λ* d(s, Z*)} | Y = y] )    (64)
where the expectation is with respect to P_{Z*|Y=y}, and

    λ* = −R′_{S|Y}(d)    (65)

The counterparts of Properties 1-3 in Section II are stated below.

Property 1′. For P_{Z*|Y=y}-almost every z,

    j_{S|Y}(s, d|y) = ı_{S;Z*|Y}(s; z|y) + λ* d(s, z) − λ* d    (66)

Property 2′. R_{S|Y}(d) = E[j_{S|Y}(S, d|Y)].

Property 3′. For any P_{Z|Y} where S → Y → Z,

    E[exp{λ* d − λ* d(S, Z) + j_{S|Y}(S, d|Y)}] ≤ 1    (67)

In parallel with (12), for discrete random variables with d(s, z) = 1{s ≠ z} we define the 0-tilted information with side information at the decompressor as

    j_{S|Y}(s, 0|y) = ı_{S|Y}(s|y)    (68)

In the following stationary memoryless examples, P_{S^k Y^k} = P_{SY} × … × P_{SY}.

Example 3. If S and Y are binary equiprobable with bit error rate distortion and P[S = 1|Y = 0] = P[S = 0|Y = 1] = p, then

    j_{S^k|Y^k}(s^k, d|y^k) = ı_{S^k|Y^k}(s^k|y^k) − k h(d)    (69)

if 0 ≤ d < min(p, 1−p), and 0 if d ≥ min(p, 1−p).

Example 4. Let S and Y be zero-mean jointly Gaussian with variances σ²_S and σ²_Y and correlation coefficient −1 < ρ < 1. Denoting the variance of S conditioned on Y by σ²_{S|Y} = σ²_S (1 − ρ²), we have

    j_{S^k|Y^k}(s^k, d|y^k) = (k/2) log(σ²_{S|Y}/d) + ( ‖s^k − ρ (σ_S/σ_Y) y^k‖² / σ²_{S|Y} − k ) (log e)/2    (70)

if 0 < d < σ²_{S|Y}, and 0 if d ≥ σ²_{S|Y}.

We now state the counterpart of Theorem 1.

Theorem 6 (Converse, source coding with side information). Fix d > d_min. Any (M, d, ɛ) code must satisfy

    ɛ ≥ sup_{γ>0} { P[j_{S|Y}(S, d|Y) ≥ log M + γ] − exp(−γ) }    (71)

Proof: Let the encoder and the decoder be the random transformations P_{U|SY} and P_{Z|UY}, where U takes values in {1, …, M}. Let Ū be equiprobable on {1, …, M}, independent of all other random variables, and let Z̄ be such that S → Y → Z̄ and P_{Z̄|Y} is the marginal of P_Ū P_{Z|UY}. We have, for any γ > 0,

    P[j_{S|Y}(S, d|Y) ≥ log M + γ]
      ≤ ɛ + (exp(−γ)/M) E[ Σ_{u=1}^M P_{U|SY}(u|S, Y) E[ exp(j_{S|Y}(S, d|Y)) 1{d(S, Z) ≤ d} | U = u, S, Y ] ]    (72)
      ≤ ɛ + exp(−γ) E[ exp(j_{S|Y}(S, d|Y) + λ* d − λ* d(S, Z̄)) ]    (73)
      = ɛ + exp(−γ)    (74)

where (72) is obtained in the same way as going from (21) to (27), (73) is by P_{U|SY}(u|s, y) ≤ 1 and the definition of Z̄, and (74) uses (67).

Similar to Theorems 1 and 2, with the substitution in (68), Theorem 6 also applies to almost-lossless data compression (i.e. d = 0 and d(s, z) = 1{s ≠ z}).
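Returning to Example 3: by (62), ı_{S^k|Y^k}(s^k|y^k) is a sum of per-letter terms, equal to log(1/p) where s_i ≠ y_i and log(1/(1−p)) where s_i = y_i. A short illustrative sketch (Python; the helper names are ours, not the paper's) evaluates (69):

```python
import math

def binary_entropy(d):
    """Binary entropy function h(d) in bits, with h(0) = h(1) = 0."""
    return 0.0 if d in (0.0, 1.0) else -d * math.log2(d) - (1 - d) * math.log2(1 - d)

def tilted_info_with_side_info(s, y, p, d):
    """d-tilted information (bits) of Example 3, eq. (69): for binary
    equiprobable S, Y with P[S != Y] = p per letter and bit error rate
    distortion, j(s^k, d | y^k) = i(s^k | y^k) - k h(d) if
    d < min(p, 1-p), and 0 otherwise."""
    k = len(s)
    if d >= min(p, 1 - p):
        return 0.0
    w = sum(si != yi for si, yi in zip(s, y))   # letters where s and y disagree
    cond_info = w * math.log2(1 / p) + (k - w) * math.log2(1 / (1 - p))
    return cond_info - k * binary_entropy(d)

# If the side information agrees with the source in every position, the
# conditional information takes its smallest value, k log(1/(1-p)).
k, p, d = 10, 0.2, 0.05
j = tilted_info_with_side_info((0,) * k, (0,) * k, p, d)
print(j)
```

Strings on which the side information is less reliable carry larger d-tilted information, mirroring the role of ı_S(s) in the no-side-information Example 1.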
In addition, Theorem 6 leads to a strong converse and a counterpart of Theorem 4 for source coding with side information.

VIII. CONCLUSION

Using the representation of the solution to the rate-distortion minimization problem (3) via the so-called d-tilted information, we have shown nonasymptotic general converse bounds for source coding (Theorem 1), joint source-channel coding (Theorem 2) and source coding with side information (Theorem 6). The new bounds are tight enough not only to show the strong converse, but also to find the corresponding rate-dispersion functions once coupled with the achievability bounds in [11], [13].

REFERENCES

[1] S. Verdú and T. S. Han, "The role of the asymptotic equipartition property in noiseless source coding," IEEE Transactions on Information Theory, vol. 43, no. 3, pp. 847-857, 1997.
[2] I. Csiszár, "On an extremum problem of information theory," Studia Scientiarum Mathematicarum Hungarica, vol. 9, no. 1, pp. 57-71, 1974.
[3] I. Kontoyiannis, "Pointwise redundancy in lossy data compression and universal lossy data compression," IEEE Transactions on Information Theory, vol. 46, no. 1, pp. 136-152, Jan. 2000.
[4] Y. Polyanskiy, H. V. Poor, and S. Verdú, "Channel coding rate in the finite blocklength regime," IEEE Transactions on Information Theory, vol. 56, no. 5, pp. 2307-2359, May 2010.
[5] S. Verdú, "ELE528: Information theory," lecture notes, Princeton University.
[6] T. S. Han, Information-Spectrum Methods in Information Theory. Berlin: Springer, 2003.
[7] J. Wolfowitz, "Notes on a general strong converse," Information and Control, vol. 12, no. 1, pp. 1-4, 1968.
[8] J. Kieffer, "Strong converses in source coding relative to a fidelity criterion," IEEE Transactions on Information Theory, vol. 37, no. 2, pp. 257-262, Mar. 1991.
[9] I. Csiszár, "On the error exponent of source-channel transmission with a distortion threshold," IEEE Transactions on Information Theory, vol. 28, no. 6, 1982.
[10] Y. Zhong, F. Alajaji, and L. Campbell, "Joint source-channel coding excess distortion exponent for some memoryless continuous-alphabet systems," IEEE Transactions on Information Theory, vol. 55, no. 3, 2009.
[11] V. Kostina and S. Verdú, "Fixed-length lossy compression in the finite blocklength regime," IEEE Transactions on Information Theory, to appear, 2012.
[12] W. Feller, An Introduction to Probability Theory and Its Applications, 2nd ed. John Wiley & Sons, 1971, vol. II.
[13] V. Kostina and S. Verdú, "Lossy joint source-channel coding in the finite blocklength regime," in Proc. IEEE International Symposium on Information Theory, Cambridge, MA, Jul. 2012, to appear.
Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 13, 2015 Lucas Slot, Sebastian Zur Shannon s Noisy-Channel Coding Theorem February 13, 2015 1 / 29 Outline 1 Definitions and Terminology
More informationPointwise Redundancy in Lossy Data Compression and Universal Lossy Data Compression
Pointwise Redundancy in Lossy Data Compression and Universal Lossy Data Compression I. Kontoyiannis To appear, IEEE Transactions on Information Theory, Jan. 2000 Last revision, November 21, 1999 Abstract
More informationReliable Computation over Multiple-Access Channels
Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,
More informationThe Information Lost in Erasures Sergio Verdú, Fellow, IEEE, and Tsachy Weissman, Senior Member, IEEE
5030 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 11, NOVEMBER 2008 The Information Lost in Erasures Sergio Verdú, Fellow, IEEE, Tsachy Weissman, Senior Member, IEEE Abstract We consider sources
More informationSoft Covering with High Probability
Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information
More informationAn Extended Fano s Inequality for the Finite Blocklength Coding
An Extended Fano s Inequality for the Finite Bloclength Coding Yunquan Dong, Pingyi Fan {dongyq8@mails,fpy@mail}.tsinghua.edu.cn Department of Electronic Engineering, Tsinghua University, Beijing, P.R.
More informationOptimal Lossless Compression: Source Varentropy and Dispersion
Optimal Lossless Compression: Source Varentropy and Dispersion Ioannis Kontoyiannis Department of Informatics Athens U. of Economics & Business Athens, Greece Email: yiannis@aueb.gr Sergio Verdú Department