New Results on the Equality of Exact and Wyner Common Information Rates
2018 IEEE International Symposium on Information Theory (ISIT)

New Results on the Equality of Exact and Wyner Common Information Rates

Badri N. Vellambi, Australian National University, Acton, ACT, Australia
Jörg Kliewer, New Jersey Institute of Technology, Newark, NJ

Abstract: Recently, Kumar, Li, and El Gamal proposed a notion of common information using a variation of the setup used to define the Wyner common information rate. This notion, known as the exact common information, is the minimum common randomness required for the exact and separate generation of a pair of correlated discrete memoryless sources. While the exact common information rate is not known to have a single-letter characterization, it was shown to equal the Wyner common information rate for the symmetric binary erasure source (Kumar-Li-El Gamal, ISIT 2014). The authors extended this result to establish the equality of the two notions of common information for general noisy typewriter, Z-, and erasure sources (Vellambi-Kliewer, Allerton 2017). In this work, we investigate the connection between exact and Wyner common information rates to derive two new implicit conditions (on the joint source distribution) that ensure the equality of the two notions.

I. INTRODUCTION

One of the fundamental aims of source coding is to quantify the amount and role of information in various multi-user communication problems. Research efforts into quantifying the information common to two (or more) correlated random variables indicate that the amount and nature of common information depend on the actual setup and/or application, and that there is no single universal notion of common information. The most common notion is Shannon's mutual information, which is the reduction in the entropy of a random variable due to the knowledge of a correlated random variable.
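To make the mutual-information baseline concrete, here is a small Python sketch (my own illustration, not from the paper) that computes $I(X;Y) = H(X) + H(Y) - H(X,Y)$ directly from a joint pmf given as a 2-D list:

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * log2(q) for q in p if q > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf Q_XY."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    pxy = [q for row in joint for q in row]     # flattened joint pmf
    return entropy(px) + entropy(py) - entropy(pxy)

# Perfectly correlated fair bits: knowing Y removes all uncertainty in X.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0 bit
# Independent fair bits: no reduction in entropy.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0 bits
```

The two test distributions illustrate the "reduction in entropy" reading: full correlation gives $I = H(X)$, independence gives $I = 0$.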
Gács and Körner envisaged the common information between two discrete memoryless sources (DMSs) as the rate of randomness that can be simultaneously extracted from either of the two correlated sources [1]. It was proven that the Gács-Körner common information between a pair of sources is more restrictive than and distinct from mutual information; this notion of common information plays an important role in the optimal and/or best-known schemes for many multi-user source coding problems [2]-[4]. Another well-known notion of common information can be traced to Wyner's seminal work [6] on the Gray-Wyner problem [5]. The Gray-Wyner problem (see Fig. 1) corresponds to the characterization of the rates of communication required to communicate a pair of correlated sources to two receivers, with each receiver requiring one of the sources. The Wyner common information rate between the two sources is the smallest communication rate $R_0$ on the common channel such that the sum rate is kept at its absolute minimum, i.e., $R_0 + R_1 + R_2 = H(X,Y)$. The Wyner common information rate is given by the following single-letter characterization:

$$W(X;Y) \triangleq \min_{W:\,X - W - Y} I(X,Y;W). \quad (1)$$

Fig. 1. The Gray-Wyner setup. [Figure residue omitted; an encoder observing $(X^n, Y^n)$ conveys a common message at rate $R_0$ and private messages at rates $R_1$, $R_2$ to two decoders.]

Fig. 2. Separate generation of correlated sources. [Figure residue omitted; a controller at rate $R_0$ feeds two randomized processors that output $\hat X^n$ and $\hat Y^n$ with $D_{KL}(Q_{\hat X^n \hat Y^n} \,\|\, Q_{XY}^n) \le \varepsilon$.]

In addition to the Gray-Wyner problem setup in [5], Wyner used the setup in Fig. 2 to show that the Wyner common information rate $W(X;Y)$ is also the smallest rate of a uniform random seed that must be supplied to two independent processors to approximately generate the two DMSs X and Y separately. In this setup, a Kullback-Leibler-divergence-based metric was chosen to quantify the precision of the approximation of the generated sources to the design distribution of the correlated discrete memoryless sources.

[Footnote] This work was supported by NSF grants CCF-00 and CNS-557.
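The objective being minimized in (1) can be evaluated for any fixed candidate $W$ satisfying $X - W - Y$. The sketch below (my own illustration; it evaluates one candidate, it does not perform the minimization in (1)) computes $I(X,Y;W) = H(X,Y) - H(X,Y|W)$ from $Q_W$, $Q_{X|W}$, and $Q_{Y|W}$:

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * log2(q) for q in p if q > 0)

def wyner_objective(q_w, q_x_given_w, q_y_given_w):
    """Evaluate I(X,Y;W) = H(X,Y) - H(X,Y|W) for one candidate W
    with X and Y conditionally independent given W (X - W - Y)."""
    # H(X,Y|W): average entropy of the conditional product pmf.
    h_xy_given_w = sum(
        pw * entropy([a * b for a in q_x_given_w[w] for b in q_y_given_w[w]])
        for w, pw in enumerate(q_w))
    # Marginal pmf of (X,Y) under Q(w) Q(x|w) Q(y|w).
    p_xy = {}
    for w, pw in enumerate(q_w):
        for x, px in enumerate(q_x_given_w[w]):
            for y, py in enumerate(q_y_given_w[w]):
                p_xy[(x, y)] = p_xy.get((x, y), 0.0) + pw * px * py
    return entropy(list(p_xy.values())) - h_xy_given_w

# Degenerate candidate W = X = Y (a fair bit): I(X,Y;W) = H(X) = 1 bit.
print(wyner_objective([0.5, 0.5], [[1, 0], [0, 1]], [[1, 0], [0, 1]]))  # 1.0
```

Minimizing this quantity over all valid $(Q_W, Q_{X|W}, Q_{Y|W})$ consistent with $Q_{XY}$ yields $W(X;Y)$.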
Owing to this operational interpretation, Wyner common information and the achievability scheme in [6] feature commonly in achievable schemes for strong coordination problems, e.g., [7]-[9]. Other connections of Wyner common information to lossy reconstruction problems, and an extension to multiple random variables, were explored in [10], [11]. It must be remarked that despite the simple formulation in (1), an explicit formula/value for the Wyner common information rate is known only for a few joint probability mass functions (pmfs) [12]. Recently, in [13], Kumar et al. proposed the notion of exact common information using the setup in Fig. 2 by requiring the distribution $Q_{\hat X^n, \hat Y^n}$ to equal the n-fold product
$Q_{XY}^n$ of $Q_{XY}$, which is the joint pmf of n i.i.d. RVs each distributed according to $Q_{XY}$. In other words, exact common information is the smallest rate of a common message that must be shared by two processors to separately generate DMSs $Q_X$ and $Q_Y$ correlated jointly and precisely according to a given $Q_{XY}$. In [13], Kumar et al. derived fundamental properties of exact common information and proved that for the symmetric binary erasure source, the exact and Wyner common information rates coincide. In [14], we presented two sufficient conditions under which the notions of exact and Wyner common information coincide. Using these conditions, it was shown that the exact and Wyner common information rates are equal for the binary Z-, the general erasure, and the general noisy typewriter sources. In this work, we derive new connections between exact and Wyner common information, and establish two new implicit conditions under which the two notions coincide.

II. NOTATION

All random variables (RVs) in this work are assumed to be over finite alphabets. Given RVs A, B, C, the conditional independence of A and C given B is denoted by $A - B - C$. The support of a RV $X \sim Q_X$ is denoted by $S(Q_X)$. Given a finite set S, $\mathbb{1}_S$ denotes the indicator function on S. For a vector $a^n \in \mathcal{A}^n$ and $\tilde a \in \mathcal{A}$, $\#_{a^n}(\tilde a) \triangleq |\{i : a_i = \tilde a\}|$. Given a joint pmf $Q_{AB}$ over $\mathcal{A} \times \mathcal{B}$, we define the following letter-typical sets [15]:

$$T_\varepsilon^n[Q_A] \triangleq \Big\{ a^n : \sup_{\tilde a \in S(Q_A)} \big|\#_{a^n}(\tilde a) - n Q_A(\tilde a)\big| \le n\varepsilon\, Q_A(\tilde a) \Big\},$$

$$T_\varepsilon^n[Q_{A|B}; b^n] \triangleq \Big\{ a^n : \sup_{(\tilde a,\tilde b) \in S(Q_{AB}),\, \#_{b^n}(\tilde b) > 0} \Big|\frac{\#_{a^n b^n}(\tilde a,\tilde b)}{\#_{b^n}(\tilde b)} - Q_{A|B}(\tilde a \mid \tilde b)\Big| \le \varepsilon\, Q_{A|B}(\tilde a \mid \tilde b) \Big\}.$$

III. PROBLEM DEFINITION AND KNOWN RESULTS

Exact common information $E(X;Y)$ is defined via Fig. 2 with the Kullback-Leibler divergence term set to zero instead. Given a joint pmf $Q_{XY}$, we say that exact generation is possible at a rate of R if for every $\varepsilon > 0$, there exists an $n \in \mathbb{N}$ and an RV $W_n$ such that $X^n - W_n - Y^n$ and $H(W_n) \le n(R + \varepsilon)$.
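The letter-typicality test above is a simple count comparison. The following Python sketch (my own illustration, using the relative-deviation normalization as reconstructed above: each count may deviate from its expectation $nQ_A(\tilde a)$ by at most $n\varepsilon Q_A(\tilde a)$) checks membership in $T_\varepsilon^n[Q_A]$:

```python
def counts(seq):
    """#_{a^n}(a~): number of occurrences of each symbol in the sequence."""
    c = {}
    for a in seq:
        c[a] = c.get(a, 0) + 1
    return c

def is_letter_typical(seq, pmf, eps):
    """Membership test for T_eps^n[Q_A]: every symbol count must be within
    a relative fraction eps of its expectation n*Q_A(a~)."""
    n, c = len(seq), counts(seq)
    return all(abs(c.get(a, 0) - n * q) <= n * eps * q
               for a, q in pmf.items() if q > 0)

pmf = {'a': 0.5, 'b': 0.5}
print(is_letter_typical('abababab', pmf, 0.1))  # True: exact frequencies
print(is_letter_typical('aaaaaaab', pmf, 0.1))  # False: 'a' over-represented
```

The conditional set $T_\varepsilon^n[Q_{A|B}; b^n]$ admits the same kind of test, with counts taken per occurring letter of $b^n$.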
That is, if $W_n$ is conveyed to two randomized processors, they can separately generate $X^n$ and $Y^n$ jointly correlated according to $Q_{XY}^n$. The exact common information is then the infimum of all such achievable rates.

Definition 1 ([13]): Given pmf $Q_{XY}$, the exact common information rate between X and Y is defined to be

$$E(X;Y) \triangleq \lim_{n\to\infty}\; \inf_{W_n:\, X^n - W_n - Y^n} \frac{H(W_n)}{n}. \quad (2)$$

Notice that the size of the alphabet of $W_n$ is allowed to grow with n, and hence, (2) is not a computable form of exact common information. We summarily present three basic properties (Remarks 1-3) of the exact common information between two random variables. Details of the proofs of these results can be found in [13], [14]. The first property is an ordering of different notions of common information. The position of the exact common information is justified by the fact that while the setups for exact and Wyner common information are identical, the source generation requirement for the former is more stringent. The second remark details the expected result that concatenation cannot decrease exact common information. Lastly, the third parallels the data processing inequality and characterizes the monotonicity of exact common information with respect to stochastic degradedness.

Remark 1 ([13]): Given $(X,Y) \sim Q_{XY}$, let $G(X;Y)$ denote the Gács-Körner common information between RVs X and Y. Then, $G(X;Y)$, $I(X;Y)$, $W(X;Y)$, $E(X;Y)$, and $\min\{H(X), H(Y)\}$ is a list of non-decreasing non-negative real numbers.

Remark 2: Let $(A,B,C,D) \sim Q_{AB} Q_{CD}$, $X = (A,C)$, and $Y = (B,D)$. Then,

$$E(X;Y) \ge \max\{E(A;B),\, E(C;D),\, W(X;Y)\}, \quad (3)$$
$$E(X;Y) \le E(A;B) + E(C;D). \quad (4)$$

Further, (4) holds with equality if $E(A;B) = W(A;B)$ and $E(C;D) = W(C;D)$.

Remark 3: $E(X;Y) \le E(X';Y')$ if $X - X' - Y' - Y$.

In addition to the above properties, in [14] we established the equality of Wyner and exact common information rates provided the pmf $Q_{XY}$ is such that there either exists:
1) a RV W such that $X - W - Y$, $I(X,Y;W) = W(X;Y)$, and $H(W|X,Y) = 0$; or
2) a RV W such that $X - W - Y$, $I(X,Y;W) = W(X;Y)$, and

$$\sum_{w \in \mathcal{W}} H(X \mid W = w)\, H(Y \mid W = w) = 0. \quad (5)$$
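The first entry of the ordering in Remark 1, the Gács-Körner common information, is computable for a finite pmf: it equals the entropy of the "common part" $G$, the label of the connected component of the support of $Q_{XY}$ viewed as a bipartite graph. A sketch (my own illustration, not from the paper) using a small union-find:

```python
from math import log2

def gacs_korner(joint):
    """H(G), where G is the connected-component label of the bipartite
    support graph of Q_XY; this equals the Gacs-Korner common information."""
    nx, ny = len(joint), len(joint[0])
    parent = list(range(nx + ny))     # union-find over rows and columns

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u

    for x in range(nx):
        for y in range(ny):
            if joint[x][y] > 0:
                parent[find(x)] = find(nx + y)  # union row x with column y
    comp_prob = {}
    for x in range(nx):
        for y in range(ny):
            g = find(x)
            comp_prob[g] = comp_prob.get(g, 0.0) + joint[x][y]
    return -sum(p * log2(p) for p in comp_prob.values() if p > 0)

# Block-diagonal source: the block index is known to both terminals.
print(gacs_korner([[0.25, 0.25, 0.0], [0.0, 0.0, 0.5]]))  # 1.0 bit
```

For a source whose support graph is fully connected, the common part is trivial and $G(X;Y) = 0$, while $I(X;Y)$ may still be positive, illustrating the strict gap noted in the introduction.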
IV. NEW RESULTS

This section contains three new results. The first result presents a bound on the exact common information between two random variables based on a decomposition using a third correlated random variable. The next two results present implicit conditions on the pmf $Q_{XY}$ under which the Wyner and exact common information notions coincide.

Theorem 1: Let $(X,Y,G) \sim Q_{X,Y,G}$. For $g \in \mathcal{G}$, let $(X_g, Y_g) \sim Q_{X_g,Y_g} \triangleq Q_{XY|G=g}$. Then,

$$E(X;Y) \le H(G) + \sum_{g \in \mathcal{G}} Q_G(g)\, E(X_g; Y_g).$$

Further, if the RV G is such that $H(G|X) = H(G|Y) = 0$, and for each $g \in \mathcal{G}$, $E(X_g;Y_g) = W(X_g;Y_g)$, then

$$E(X;Y) = H(G) + \sum_{g \in \mathcal{G}} Q_G(g)\, E(X_g;Y_g) = H(G) + \sum_{g \in \mathcal{G}} Q_G(g)\, W(X_g;Y_g) = W(X;Y).$$

Proof: Let $\varepsilon > 0$. Let $n_0$ be a positive integer such that for $l > n_0$, $P[G^l \notin T_\varepsilon^l[Q_G]] \le \varepsilon$. Let $n > n_0$ then be chosen. Let $\mathcal{G} = \{g_1, \ldots, g_k\}$, and for the sake of notational ease, let $n_{l,\varepsilon} \triangleq n(1+\varepsilon)\, Q_G(g_l)$ for $l = 1, \ldots, k$. Further, define

$$X_{(n,\varepsilon)} \triangleq [\,G^n,\, (X_{g_1})^{n_{1,\varepsilon}}, \ldots, (X_{g_k})^{n_{k,\varepsilon}}\,],$$
$$Y_{(n,\varepsilon)} \triangleq [\,G^n,\, (Y_{g_1})^{n_{1,\varepsilon}}, \ldots, (Y_{g_k})^{n_{k,\varepsilon}}\,],$$
where $X_{(n,\varepsilon)}$ contains n i.i.d. copies of G and $n_{i,\varepsilon}$ i.i.d. copies of $X_{g_i}$, $i = 1,\ldots,k$. Similarly, $Y_{(n,\varepsilon)}$ contains n i.i.d. copies of G and $n_{i,\varepsilon}$ i.i.d. copies of $Y_{g_i}$, $i = 1,\ldots,k$. Let the joint distribution between these two sources be

$$Q_{X_{(n,\varepsilon)},\, Y_{(n,\varepsilon)}} \triangleq Q_G^n \otimes \bigotimes_{l=1}^{k} Q_{X_{g_l} Y_{g_l}}^{n_{l,\varepsilon}}. \quad (6)$$

From Remark 2, it follows that

$$E(X_{(n,\varepsilon)}; Y_{(n,\varepsilon)}) \le n H(G) + \sum_l n_{l,\varepsilon}\, E(X_{g_l}; Y_{g_l}). \quad (7)$$

Suppose that we are given a scheme for generating N symbols of $Q_{X_{(n,\varepsilon)},\, Y_{(n,\varepsilon)}}$ using a message $M_N$ that the controller conveys in no more than $N(E(X_{(n,\varepsilon)}; Y_{(n,\varepsilon)}) + \varepsilon)$ bits. Since the nN symbols of G are common to both terminals, these nN symbols must be a function of the common message conveyed by the controller. Using this fact, we can devise a scheme to generate nN copies of $Q_{XY}$ in the following way.

A: If the realization of $G^{nN}$ corresponding to $M_N$ is an element of $T_\varepsilon^{nN}[Q_G]$, the X-terminal can generate $n_{l,\varepsilon} N$ copies of $X_{g_l}$, $l = 1,\ldots,k$, which it can reposition according to the realization of $G^{nN}$ while maintaining the same order of realization for indices with the same $g_l$, $l = 1,\ldots,k$. The Y-terminal operates similarly. An illustration is given in Fig. 3.

B: If the realization of $G^{nN}$ corresponding to $M_N$ is not in $T_\varepsilon^{nN}[Q_G]$, the controller simply generates nN symbols of the two sources and forwards the realizations to both terminals using at most $nN \log|\mathcal{X}||\mathcal{Y}| + 1$ bits.

Fig. 3. Repositioning the realizations of $X_{(n,\varepsilon)}$, $Y_{(n,\varepsilon)}$ according to the generated $G^{nN}$ to generate nN copies of $Q_{XY}$. [Figure residue omitted; the figure shows per-letter blocks of $X_g$ and $Y_g$ symbols interleaved into the positions where $G^{nN}$ takes the corresponding value, with possibly unused surplus symbols.]

This scheme generates nN copies of the source $Q_{XY}$ using N copies of $Q_{X_{(n,\varepsilon)},\, Y_{(n,\varepsilon)}}$ with an additional overhead (due to the atypicality of $G^{nN}$).
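The repositioning step in Fig. 3 is a simple deterministic interleaving: the i-th generated symbol for letter g is placed at the i-th position where g occurs in the common sequence. A sketch (my own illustration, not from the paper):

```python
def reposition(g_seq, per_letter_streams):
    """Place the i-th generated symbol for letter g at the i-th position
    where g occurs in g_seq; order within each letter is preserved.
    Surplus symbols at the end of a stream simply go unused."""
    cursors = {g: 0 for g in per_letter_streams}
    out = []
    for g in g_seq:
        out.append(per_letter_streams[g][cursors[g]])
        cursors[g] += 1
    return out

g_seq = [0, 1, 0, 0, 1]
streams = {0: ['x1', 'x2', 'x3'], 1: ['u1', 'u2']}
print(reposition(g_seq, streams))  # ['x1', 'u1', 'x2', 'x3', 'u2']
```

Because both terminals apply the same rule to the same $G^{nN}$, the per-position pairing of X- and Y-symbols is preserved, which is what keeps the repositioned outputs jointly distributed according to $Q_{XY}$.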
Hence, we have

$$nN \cdot E(X;Y) \le N\big(E(X_{(n,\varepsilon)}; Y_{(n,\varepsilon)}) + \varepsilon\big) + \varepsilon\big(nN \log|\mathcal{X}||\mathcal{Y}| + 1\big), \quad (8)$$

where we have used the fact that $P[G^{nN} \notin T_\varepsilon^{nN}[Q_G]] \le \varepsilon$. Replacing $E(X_{(n,\varepsilon)}; Y_{(n,\varepsilon)})$ with the bound in (7), taking n to infinity, and then ε to zero, we obtain

$$E(X;Y) \le H(G) + \sum_l Q_G(g_l)\, E(X_{g_l}; Y_{g_l}), \quad (9)$$

which is the upper bound we set out to establish. Now, if $E(X_{g_l}; Y_{g_l}) = W(X_{g_l}; Y_{g_l})$ for each component pmf $Q_{X_{g_l} Y_{g_l}}$, then from Remark 2, we are guaranteed that (7) holds with equality, i.e., for any ε > 0 and $n \in \mathbb{N}$,

$$E(X_{(n,\varepsilon)}; Y_{(n,\varepsilon)}) = n H(G) + \sum_l n_{l,\varepsilon}\, E(X_{g_l}; Y_{g_l}). \quad (10)$$

In this setting, we can adapt a scheme for the exact generation of $Q_{XY}$. Fix $\varepsilon \in (0, 1/n_0)$. We can then find a suitably large $m > 1/\varepsilon > n_0$ and an exact generation scheme for generating m copies of the joint source $Q_{XY}$, where the controller conveys a message $M_m$ with an average length of no more than $m(E(X;Y) + \varepsilon)$ bits. Since $X^m - M_m - Y^m$, it must follow that $I(X^m; Y^m \mid M_m) = 0$, which yields $H(G^m \mid M_m) = 0$, i.e., the controller knows $G^m$. We will use this fact to devise a scheme for the source $Q_{X_{(m',\varepsilon)},\, Y_{(m',\varepsilon)}}$, where $m' = m\frac{1-\varepsilon}{1+\varepsilon}$.

To derive a scheme that outputs one symbol of $Q_{X_{(m',\varepsilon)},\, Y_{(m',\varepsilon)}}$, the controller generates a realization of $M_m$. The controller uses the realization to verify if the corresponding realization of $G^m$ is ε-letter typical. If so, it conveys the typicality of $G^m$ using a bit, and then conveys the realization of $M_m$ to the two terminals. The two terminals use the received message and generate $X^m$ and $Y^m$, respectively. The terminals then identify $G^m = f(X^m) = g(Y^m)$. Since $G^m$ is typical, the terminals can definitely identify $m(1-\varepsilon) Q_G(g_l) \ge m'(1+\varepsilon) Q_G(g_l)$ copies of $X_{g_l}$ and $Y_{g_l}$, $l = 1,\ldots,k$, respectively. Employing the repositioning technique of Fig. 3 in the reverse direction, we can then identify one symbol of $X_{(m',\varepsilon)}$ and $Y_{(m',\varepsilon)}$ at the two terminals.
On the other hand, if the $G^m$ corresponding to the realization of $M_m$ is atypical, the controller conveys the atypicality using a bit, and then generates a realization of $(X_{(m',\varepsilon)}, Y_{(m',\varepsilon)})$ and conveys it in no more than $m \log|\mathcal{X}||\mathcal{Y}| + 1$ bits. The above scheme is then an exact generation scheme for one symbol of $Q_{X_{(m',\varepsilon)},\, Y_{(m',\varepsilon)}}$, and hence,

$$E(X_{(m',\varepsilon)}; Y_{(m',\varepsilon)}) \le 1 + m(E(X;Y) + \varepsilon) + \varepsilon(m \log|\mathcal{X}||\mathcal{Y}| + 1),$$

where we have used $P[G^m \notin T_\varepsilon^m[Q_G]] \le \varepsilon$. Combining the above with (10), and letting ε → 0, we see that

$$E(X;Y) \ge H(G) + \sum_l Q_G(g_l)\, E(X_{g_l}; Y_{g_l}), \quad (11)$$

which, together with (9), establishes the second part of the claim.

Theorem 2: If $Q_{XY} = \frac{\mathbb{1}_{S(Q_{XY})}}{|S(Q_{XY})|}$, i.e., if $Q_{XY}$ is uniform over its support, then

$$E(X;Y) = W(X;Y). \quad (12)$$

Proof: Just as in the corresponding proof in [14], we devise a two-stage scheme to exactly match the output statistics of the two sources to $Q_{XY}^n$. The first stage uses a modified channel resolvability codebook [7], [16] so that the joint pmf of the source outputs satisfies a nearly-uniform-type convergence constraint. The second stage then refines the shortcomings of the first to match the exact generation requirement.

Pick an auxiliary RV W and pmf $Q_{XWY}$ such that (a) $I(X,Y;W) = W(X;Y)$ and (b) $X - W - Y$. To exactly generate the sources, pick ε > 0 and n large so that

$$P\big[W^n \in T_\varepsilon^n[Q_W]\big] > 1 - \varepsilon, \quad (13)$$
and for every $w^n \in T_\varepsilon^n[Q_W]$, with $\tilde X^n \sim Q^n_{X|W}(\cdot \mid w^n)$ and $\tilde Y^n \sim Q^n_{Y|W}(\cdot \mid w^n)$,

$$P\big[\tilde X^n \in T_\varepsilon^n[Q_{X|W}; w^n]\big] > 1 - \varepsilon, \quad (14)$$
$$P\big[\tilde Y^n \in T_\varepsilon^n[Q_{Y|W}; w^n]\big] > 1 - \varepsilon. \quad (15)$$

Let $l = |S(Q_{XY})|$. Then, due to the structure of $Q_{XY}$, $H(X,Y) = \log l$. Let the rate R be given by

$$R = I(X,Y;W) + \varepsilon \log|\mathcal{X}||\mathcal{Y}| \quad (16)$$
$$\;\;= \log l - H(X,Y \mid W) + \varepsilon \log|\mathcal{X}||\mathcal{Y}|. \quad (17)$$

Now, let $\mathcal{C} \triangleq \{W^n(i)\}_{i=1}^{2^{nR}}$ be a codebook with each codeword $W^n(i)$, $i = 1,\ldots,2^{nR}$, generated i.i.d. using

$$\tilde Q^n_W(w^n) = \frac{Q^n_W(w^n)\, \mathbb{1}_{T_\varepsilon^n[Q_W]}(w^n)}{Q^n_W\big(T_\varepsilon^n[Q_W]\big)}. \quad (18)$$

By construction, every codeword in the codebook is ε-strongly letter typical. Now, define a channel $\tilde Q^n_{X|W}(\cdot \mid \cdot)$ with input alphabet $T_\varepsilon^n[Q_W]$ and output alphabet $\mathcal{X}^n$ by

$$\tilde Q^n_{X|W}(x^n \mid w^n) = \frac{Q^n_{X|W}(x^n \mid w^n)\, \mathbb{1}_{T_\varepsilon^n[Q_{X|W}; w^n]}(x^n)}{Q^n_{X|W}\big(T_\varepsilon^n[Q_{X|W}; w^n] \,\big|\, w^n\big)}, \quad (19)$$

and similarly define the channel $\tilde Q^n_{Y|W}$ with input alphabet $T_\varepsilon^n[Q_W]$ and output alphabet $\mathcal{Y}^n$ by

$$\tilde Q^n_{Y|W}(y^n \mid w^n) = \frac{Q^n_{Y|W}(y^n \mid w^n)\, \mathbb{1}_{T_\varepsilon^n[Q_{Y|W}; w^n]}(y^n)}{Q^n_{Y|W}\big(T_\varepsilon^n[Q_{Y|W}; w^n] \,\big|\, w^n\big)}. \quad (20)$$

For $(x^n, y^n) \in \mathcal{X}^n \times \mathcal{Y}^n$, define $2^{nR}$ i.i.d. RVs $Z_i(x^n, y^n)$ by

$$Z_i(x^n, y^n) = \tilde Q^n_{X|W}\big(x^n \mid W^n(i)\big)\, \tilde Q^n_{Y|W}\big(y^n \mid W^n(i)\big), \quad (21)$$

and let $\mu(x^n, y^n) \triangleq E[Z_i(x^n, y^n)]$. Using the properties of letter-typical sequences and (13)-(15), one can show that for any $i = 1,\ldots,2^{nR}$ and $(x^n, y^n) \in \mathcal{X}^n \times \mathcal{Y}^n$,

$$Z_i(x^n, y^n) \in \Big[0,\; \frac{2^{-n H(X,Y|W)(1-\varepsilon)}}{(1-\varepsilon)^2}\Big], \quad (22)$$
$$(1-\varepsilon)^3\, Q_{XY}^n(x^n, y^n) \le \mu(x^n, y^n) \le \frac{Q_{XY}^n(x^n, y^n)}{(1-\varepsilon)^3} \;\;\text{for } (x^n, y^n) \in S(\mu), \quad (23)$$
$$E\big[Z_i(x^n, y^n)^2\big] \le \frac{2^{-n H(X,Y|W)(1-\varepsilon)}}{(1-\varepsilon)^2}\, \mu(x^n, y^n), \quad (24)$$

where $Q_{XY}^n(x^n, y^n) = l^{-n}$ on $S(Q_{XY}^n)$. Now, define a (random) pmf $\tilde P$ on $\mathcal{X}^n \times \mathcal{Y}^n$ by

$$\tilde P(x^n, y^n) \triangleq \frac{1}{2^{nR}} \sum_{i=1}^{2^{nR}} Z_i(x^n, y^n),$$

which is the joint pmf of the outputs when a randomly selected codeword from the random channel resolvability codebook $\mathcal{C}$ is passed through the two parallel channels $\tilde Q^n_{X|W}$ and $\tilde Q^n_{Y|W}$, respectively. Unlike in the corresponding proof in [14], the supports of the pmfs $\tilde P$ and $E[\tilde P] = \mu$ need not lie within $T_\varepsilon^n[Q_{XY}]$. Instead, we are only guaranteed that

$$S(\tilde P) \subseteq S(\mu) \subseteq \big(T_\varepsilon^n[Q_X] \times T_\varepsilon^n[Q_Y]\big) \cap S(Q_{XY}^n). \quad (25)$$

Thus, it is possible that the realizations of the pair of sources emitted by a random codebook are atypical. However, this will not cause a severe problem because of the specific form of the joint pmf $Q_{XY}$. Now, let $\eta \triangleq \frac{(1-\varepsilon)^3}{1+\varepsilon}$.
Then, we see that for any $(x^n, y^n) \in S(\mu)$, the deviation threshold

$$\Delta(x^n, y^n) \triangleq \frac{l^{-n}}{\eta} - \mu(x^n, y^n) \quad (26)$$

satisfies

$$\Delta(x^n, y^n) \ge (1+\varepsilon)\,\mu(x^n, y^n) - \mu(x^n, y^n) = \varepsilon\,\mu(x^n, y^n) > 0, \quad (27)$$

since $l^{-n}/\eta = (1+\varepsilon) l^{-n}/(1-\varepsilon)^3 \ge (1+\varepsilon)\mu(x^n, y^n)$ by (23). Note that the random pmf $\tilde P$ is a weighted sum of i.i.d. RVs, and hence one can use standard bounding techniques to establish concentration properties of tail events. Now consider the following tail event for $(x^n, y^n) \in S(\mu)$:

$$P\big[\tilde P(x^n, y^n) - \mu(x^n, y^n) > \Delta(x^n, y^n)\big] = P\Big[\frac{1}{2^{nR}}\sum_{i=1}^{2^{nR}} Z_i(x^n, y^n) - \mu(x^n, y^n) > \Delta(x^n, y^n)\Big]$$
$$\le \exp\Bigg(-\frac{2^{nR}\,\Delta^2(x^n, y^n)}{2\,\mathrm{var}\big(Z_1(x^n, y^n)\big) + \frac{2}{3}\,\frac{2^{-n H(X,Y|W)(1-\varepsilon)}}{(1-\varepsilon)^2}\,\Delta(x^n, y^n)}\Bigg)$$
$$\overset{(22),(24),(27)}{\le} \exp\Big(-c(\varepsilon)\, 2^{\,n\left(R - \log l + H(X,Y|W)(1-\varepsilon)\right)}\Big) \le \exp\Big(-c(\varepsilon)\, 2^{\,n\varepsilon\left(\log|\mathcal{X}||\mathcal{Y}| - H(X,Y|W)\right)}\Big), \quad (28)$$

where $c(\varepsilon) > 0$ depends only on ε, and we have used Bernstein's inequality [17] to bound the tail event at the second step. We can then proceed with a union bound over all pairs $(x^n, y^n) \in S(\mu)$ as follows:

$$P\Big[\bigcap_{(x^n, y^n) \in S(\mu)} \Big\{\tilde P(x^n, y^n) \le \frac{l^{-n}}{\eta}\Big\}\Big] \ge 1 - \sum_{(x^n, y^n) \in S(\mu)} P\big[\tilde P(x^n, y^n) - \mu(x^n, y^n) > \Delta(x^n, y^n)\big]$$
$$\ge 1 - |\mathcal{X}|^n |\mathcal{Y}|^n \exp\Big(-c(\varepsilon)\, 2^{\,n\varepsilon\left(\log|\mathcal{X}||\mathcal{Y}| - H(X,Y|W)\right)}\Big), \quad (29)$$

which approaches unity as n diverges. Thus, for sufficiently large n, there exists a codebook $\mathcal{C}$ with $\tilde P(x^n, y^n) \le l^{-n}/\eta$ for all $(x^n, y^n)$. Then $r^n \triangleq \frac{Q_{XY}^n - \eta \tilde P}{1-\eta}$ defines a pmf over $\mathcal{X}^n \times \mathcal{Y}^n$. Further, $Q_{XY}^n$ is expressible as a convex combination of the two pmfs $\tilde P$ and $r^n$, i.e.,

$$Q_{XY}^n = \eta\, \tilde P + (1-\eta)\, r^n. \quad (30)$$
The above equation essentially yields the required two-stage exact generation scheme. To generate n copies of X and Y distributed according to $Q_{XY}^n$, the controller first generates an instance of a binary RV V with $Q_V(1) = \eta$. The controller conveys V to both terminals. If V = 0, the controller additionally generates an instance of $(\tilde X^n, \tilde Y^n) \sim r^n$, and conveys it in $n \log|\mathcal{X}||\mathcal{Y}|$ bits. Note that in this case, each terminal knows both source realizations. Now, if V = 1, the controller generates nR bits uniformly at random, and conveys them to both terminals. The terminals use the bits to identify the appropriate codeword from $\mathcal{C}$, and generate their source realizations using the chosen codeword and the respective channels ($\tilde Q^n_{X|W}$ or $\tilde Q^n_{Y|W}$). On average, this exact source-generation scheme uses no more than

$$\frac{1}{n} + \eta R + (1-\eta)\Big(\log|\mathcal{X}||\mathcal{Y}| + \frac{1}{n}\Big)$$

bits/symbol. By allowing n to grow unbounded and then allowing ε to vanish (so that η → 1), we can see that the above quantity approaches the required limit of $I(X,Y;W) = W(X;Y)$. Thus, we can build schemes for separate, exact generation of the pair of sources at rates arbitrarily close to, but larger than, $W(X;Y)$. When combined with Remark 1, the claim then follows.

Theorem 3: Suppose $Q_{XY}$ (when viewed as a matrix) consists only of rational entries and that in each row, the non-zero entries are identical (but the non-zero entries in different rows can be distinct). Then $E(X;Y) = W(X;Y)$.

Proof: Let $\frac{k_i}{l_i}$ (in lowest terms) be the non-zero entry appearing in the i-th row of $Q_{XY}$. Let $l = \mathrm{lcm}\{l_i : i = 1,\ldots,|\mathcal{X}|\}$ and $\kappa = \gcd\{k_i : i = 1,\ldots,|\mathcal{X}|\}$. Let $m_i \triangleq \frac{k_i}{\kappa}\cdot\frac{l}{l_i} \in \mathbb{N}$ and $S_i = S(Q_{Y|X=i})$ for $i = 1,\ldots,|\mathcal{X}|$. Let $\mathcal{X}' \triangleq \{1,\ldots,\sum_{i=1}^{|\mathcal{X}|} m_i\}$. Define a noiseless but random channel $Q_{X'|X}$ with input alphabet $\mathcal{X}$ and output alphabet $\mathcal{X}'$ such that: (a) $|S(Q_{X'|X=i})| = m_i$; (b) $H(X' \mid X = i) = \log m_i$; and (c) $H(X \mid X') = 0$. An illustration of such a channel is given in Fig. 4. Let $Q_{X'Y} \triangleq Q_{X'|X}\, Q_{XY}$. Then, by definition, $X - X' - Y$.
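The alphabet-splitting quantities defined above are directly computable. The sketch below (my own illustration, not from the paper) extracts $k_i/l_i$ from each row of a rational matrix and returns the multiplicities $m_i = \frac{k_i}{\kappa}\cdot\frac{l}{l_i}$:

```python
from fractions import Fraction
from math import gcd, lcm  # variadic gcd/lcm require Python 3.9+

def split_multiplicities(rows):
    """For each row with common non-zero entry k_i/l_i (Fraction reduces
    to lowest terms automatically), compute m_i = (k_i/kappa)*(l/l_i)
    with l = lcm of the l_i and kappa = gcd of the k_i."""
    entries = []
    for row in rows:
        nz = {Fraction(v) for v in row if v != 0}
        assert len(nz) == 1, "non-zero entries in a row must be identical"
        entries.append(nz.pop())
    l = lcm(*(e.denominator for e in entries))
    kappa = gcd(*(e.numerator for e in entries))
    return [(e.numerator // kappa) * (l // e.denominator) for e in entries]

# Joint pmf with constant non-zero row entries 1/4 and 1/6.
rows = [[Fraction(1, 4), Fraction(1, 4), 0],
        [Fraction(1, 6), Fraction(1, 6), Fraction(1, 6)]]
print(split_multiplicities(rows))  # [3, 2]
```

Splitting row i into $m_i$ rows then makes every non-zero entry of $Q_{X'Y}$ equal to $\kappa/l$ (here $1/12$), which is exactly the uniform-over-support structure that Theorem 2 requires.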
Further, since $H(X \mid X') = 0$, it is also true that $X' - X - Y$. Thus, by Remark 3 and its analogue for Wyner common information, it follows that

$$E(X'; Y) = E(X; Y), \quad (31)$$
$$W(X'; Y) = W(X; Y). \quad (32)$$

For $m \in \mathbb{N}$ and $S \subseteq \{1,\ldots,|\mathcal{Y}|\}$, let $I_{m,S}$ be an $m \times |\mathcal{Y}|$-dimensional $\{0,1\}$-matrix with ones at locations (i, j) if and only if $j \in S$. Then, by definition, it can be seen that

$$Q_{X'Y} = \frac{\kappa}{l}\begin{bmatrix} I_{m_1,S_1} \\ I_{m_2,S_2} \\ \vdots \\ I_{m_{|\mathcal{X}|},S_{|\mathcal{X}|}} \end{bmatrix}. \quad (33)$$

The claim then follows from (31), (32), and by noticing that $Q_{X'Y}$ meets the structural requirement of Theorem 2.

Fig. 4. An illustration of the splitting of the source alphabet. [Figure residue omitted; the figure shows the channel $Q_{X'|X}$ splitting the source alphabet $\mathcal{X}$ into an enlarged alphabet $\mathcal{X}'$ so that $Q_{X'Y}$ is composed of blocks $I_{m_i,S_i}$.]

Since X and Y are interchangeable, the following also holds.

Remark 4: Suppose $Q_{XY}$ (when viewed as a matrix) consists only of rational entries and that in each column, the non-zero entries are identical (but the non-zero entries in different columns can differ). Then $E(X;Y) = W(X;Y)$.

REFERENCES

[1] P. Gács and J. Körner, "Common information is far less than mutual information," Problems of Control and Information Theory, vol. 2, no. 2, pp. 149-162, 1973.
[2] A. B. Wagner, B. G. Kelly, and Y. Altug, "The lossy one-helper conjecture is false," in 47th Annual Allerton Conference on Communication, Control, and Computing, Sept. 2009.
[3] B. N. Vellambi and R. Timo, "The Heegard-Berger problem with common receiver reconstructions," in IEEE Information Theory Workshop (ITW), Sept. 2013.
[4] B. N. Vellambi and R. Timo, "Successive refinement with common receiver reconstructions," in 2014 IEEE International Symposium on Information Theory, June 2014.
[5] R. M. Gray and A. D. Wyner, "Source coding for a simple network," The Bell System Technical Journal, vol. 53, no. 9, pp. 1681-1721, Nov. 1974.
[6] A. D. Wyner, "The common information of two dependent random variables," IEEE Transactions on Information Theory, vol. 21, no. 2, pp. 163-179, Mar. 1975.
[7] P. Cuff, "Communication requirements for generating correlated random variables," in 2008 IEEE International Symposium on Information Theory, July 2008.
[8] B. N. Vellambi, J. Kliewer, and M. Bloch, "Strong coordination over multi-hop line networks," in 2015 IEEE Information Theory Workshop, Oct. 2015.
[9] B. N. Vellambi, J. Kliewer, and M. Bloch, "Strong coordination over a line when actions are Markovian," in 50th Annual Conference on Information Sciences and Systems (CISS), Mar. 2016.
[10] G. Xu, W. Liu, and B. Chen, "A lossy source coding interpretation of Wyner's common information," IEEE Transactions on Information Theory, vol. 62, no. 2, Feb. 2016.
[11] K. B. Viswanatha, E. Akyol, and K. Rose, "The lossy common information of correlated sources," IEEE Transactions on Information Theory, vol. 60, no. 6, pp. 3238-3253, June 2014.
[12] H. S. Witsenhausen, "Values and bounds for the common information of two discrete random variables," SIAM Journal on Applied Mathematics, vol. 31, no. 2, pp. 313-333, 1976.
[13] G. R. Kumar, C. T. Li, and A. El Gamal, "Exact common information," in 2014 IEEE International Symposium on Information Theory, June 2014.
[14] B. N. Vellambi and J. Kliewer, "Sufficient conditions for the equality of exact and Wyner common information," in 55th Annual Allerton Conference on Communication, Control, and Computing, Sep. 2017.
[15] G. Kramer, "Topics in multi-user information theory," Foundations and Trends in Communications and Information Theory, vol. 4, no. 4-5, pp. 265-444, 2007.
[16] T. S. Han and S. Verdú, "Approximation theory of output statistics," IEEE Transactions on Information Theory, vol. 39, no. 3, pp. 752-772, May 1993.
[17] S. Boucheron, G. Lugosi, and P. Massart, Concentration Inequalities: A Nonasymptotic Theory of Independence. Oxford University Press, 2013.
Source and Channel Coding for Correlated Sources Over Multiuser Channels Deniz Gündüz, Elza Erkip, Andrea Goldsmith, H. Vincent Poor Abstract Source and channel coding over multiuser channels in which
More informationOn Multiple User Channels with State Information at the Transmitters
On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu
More informationPerformance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)
Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song
More informationOn the Duality between Multiple-Access Codes and Computation Codes
On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch
More informationInformation Theoretic Limits of Randomness Generation
Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication
More informationOn the Capacity Region for Secure Index Coding
On the Capacity Region for Secure Index Coding Yuxin Liu, Badri N. Vellambi, Young-Han Kim, and Parastoo Sadeghi Research School of Engineering, Australian National University, {yuxin.liu, parastoo.sadeghi}@anu.edu.au
More informationELEC546 Review of Information Theory
ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random
More informationSecret Key Agreement Using Asymmetry in Channel State Knowledge
Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,
More informationECE Information theory Final (Fall 2008)
ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1
More informationEquivalence for Networks with Adversarial State
Equivalence for Networks with Adversarial State Oliver Kosut Department of Electrical, Computer and Energy Engineering Arizona State University Tempe, AZ 85287 Email: okosut@asu.edu Jörg Kliewer Department
More informationSubset Universal Lossy Compression
Subset Universal Lossy Compression Or Ordentlich Tel Aviv University ordent@eng.tau.ac.il Ofer Shayevitz Tel Aviv University ofersha@eng.tau.ac.il Abstract A lossy source code C with rate R for a discrete
More informationOn The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers
On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers Albrecht Wolf, Diana Cristina González, Meik Dörpinghaus, José Cândido Silveira Santos Filho, and Gerhard Fettweis Vodafone
More informationJoint Source-Channel Coding for the Multiple-Access Relay Channel
Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il
More informationExercises with solutions (Set B)
Exercises with solutions (Set B) 3. A fair coin is tossed an infinite number of times. Let Y n be a random variable, with n Z, that describes the outcome of the n-th coin toss. If the outcome of the n-th
More information5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006
5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 Source Coding With Limited-Look-Ahead Side Information at the Decoder Tsachy Weissman, Member, IEEE, Abbas El Gamal, Fellow,
More informationSOURCE coding problems with side information at the decoder(s)
1458 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 59, NO. 3, MARCH 2013 Heegard Berger Cascade Source Coding Problems With Common Reconstruction Constraints Behzad Ahmadi, Student Member, IEEE, Ravi Ton,
More informationTwo Applications of the Gaussian Poincaré Inequality in the Shannon Theory
Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Vincent Y. F. Tan (Joint work with Silas L. Fong) National University of Singapore (NUS) 2016 International Zurich Seminar on
More informationSource-channel coding for coordination over a noisy two-node network
New Jersey Institute of Technology Digital Commons @ NJIT Theses Theses and Dissertations Fall 2016 Source-channel coding for coordination over a noisy two-node network Sarah A. Obead New Jersey Institute
More informationCapacity of the Discrete Memoryless Energy Harvesting Channel with Side Information
204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener
More informationA Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels
A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,
More informationCapacity Region of the Permutation Channel
Capacity Region of the Permutation Channel John MacLaren Walsh and Steven Weber Abstract We discuss the capacity region of a degraded broadcast channel (DBC) formed from a channel that randomly permutes
More informationMultiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets
Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,
More informationLecture 1: The Multiple Access Channel. Copyright G. Caire 12
Lecture 1: The Multiple Access Channel Copyright G. Caire 12 Outline Two-user MAC. The Gaussian case. The K-user case. Polymatroid structure and resource allocation problems. Copyright G. Caire 13 Two-user
More informationAn Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel
Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and
More informationLattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function
Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function Dinesh Krithivasan and S. Sandeep Pradhan Department of Electrical Engineering and Computer Science,
More informationLossy Distributed Source Coding
Lossy Distributed Source Coding John MacLaren Walsh, Ph.D. Multiterminal Information Theory, Spring Quarter, 202 Lossy Distributed Source Coding Problem X X 2 S {,...,2 R } S 2 {,...,2 R2 } Ẑ Ẑ 2 E d(z,n,
More informationNetwork coding for multicast relation to compression and generalization of Slepian-Wolf
Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues
More informationCapacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel
Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel Jun Chen Dept. of Electrical and Computer Engr. McMaster University Hamilton, Ontario, Canada Chao Tian AT&T Labs-Research 80 Park
More informationLecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122
Lecture 5: Channel Capacity Copyright G. Caire (Sample Lectures) 122 M Definitions and Problem Setup 2 X n Y n Encoder p(y x) Decoder ˆM Message Channel Estimate Definition 11. Discrete Memoryless Channel
More informationSubset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding
Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California - Santa Barbara {kumar,eakyol,rose}@ece.ucsb.edu
More informationAchieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel
Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of
More informationThe Capacity Region for Multi-source Multi-sink Network Coding
The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.
More informationPART III. Outline. Codes and Cryptography. Sources. Optimal Codes (I) Jorge L. Villar. MAMME, Fall 2015
Outline Codes and Cryptography 1 Information Sources and Optimal Codes 2 Building Optimal Codes: Huffman Codes MAMME, Fall 2015 3 Shannon Entropy and Mutual Information PART III Sources Information source:
More informationA Comparison of Two Achievable Rate Regions for the Interference Channel
A Comparison of Two Achievable Rate Regions for the Interference Channel Hon-Fah Chong, Mehul Motani, and Hari Krishna Garg Electrical & Computer Engineering National University of Singapore Email: {g030596,motani,eleghk}@nus.edu.sg
More informationReliable Computation over Multiple-Access Channels
Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,
More informationChain Independence and Common Information
1 Chain Independence and Common Information Konstantin Makarychev and Yury Makarychev Abstract We present a new proof of a celebrated result of Gács and Körner that the common information is far less than
More informationOn the Rate-Limited Gelfand-Pinsker Problem
On the Rate-Limited Gelfand-Pinsker Problem Ravi Tandon Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 74 ravit@umd.edu ulukus@umd.edu Abstract
More informationEE5319R: Problem Set 3 Assigned: 24/08/16, Due: 31/08/16
EE539R: Problem Set 3 Assigned: 24/08/6, Due: 3/08/6. Cover and Thomas: Problem 2.30 (Maimum Entropy): Solution: We are required to maimize H(P X ) over all distributions P X on the non-negative integers
More informationInterference Channel aided by an Infrastructure Relay
Interference Channel aided by an Infrastructure Relay Onur Sahin, Osvaldo Simeone, and Elza Erkip *Department of Electrical and Computer Engineering, Polytechnic Institute of New York University, Department
More informationarxiv: v1 [cs.it] 4 Jun 2018
State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates
More informationHomework Set #2 Data Compression, Huffman code and AEP
Homework Set #2 Data Compression, Huffman code and AEP 1. Huffman coding. Consider the random variable ( x1 x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0.11 0.04 0.04 0.03 0.02 (a Find a binary Huffman code
More informationLecture 11: Quantum Information III - Source Coding
CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that
More informationNotes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel
Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic
More informationLecture 4 Channel Coding
Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity
More informationLECTURE 13. Last time: Lecture outline
LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to
More informationIntermittent Communication
Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication
More informationInformation Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results
Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel
More informationDistributed Lossless Compression. Distributed lossless compression system
Lecture #3 Distributed Lossless Compression (Reading: NIT 10.1 10.5, 4.4) Distributed lossless source coding Lossless source coding via random binning Time Sharing Achievability proof of the Slepian Wolf
More informationCommon Randomness Principles of Secrecy
Common Randomness Principles of Secrecy Himanshu Tyagi Department of Electrical and Computer Engineering and Institute of Systems Research 1 Correlated Data, Distributed in Space and Time Sensor Networks
More informationShannon s Noisy-Channel Coding Theorem
Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 2015 Abstract In information theory, Shannon s Noisy-Channel Coding Theorem states that it is possible to communicate over a noisy
More informationElectrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7
Electrical and Information Technology Information Theory Problems and Solutions Contents Problems.......... Solutions...........7 Problems 3. In Problem?? the binomial coefficent was estimated with Stirling
More informationA Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying
A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,
More informationII. THE TWO-WAY TWO-RELAY CHANNEL
An Achievable Rate Region for the Two-Way Two-Relay Channel Jonathan Ponniah Liang-Liang Xie Department of Electrical Computer Engineering, University of Waterloo, Canada Abstract We propose an achievable
More informationA Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels
A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu
More informationArimoto Channel Coding Converse and Rényi Divergence
Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code
More informationCommon Information and Decentralized Inference with Dependent Observations
Syracuse University SURFACE Electrical Engineering and Computer Science - Dissertations College of Engineering and Computer Science 8-2013 Common Information and Decentralized Inference with Dependent
More informationInformation Theory CHAPTER. 5.1 Introduction. 5.2 Entropy
Haykin_ch05_pp3.fm Page 207 Monday, November 26, 202 2:44 PM CHAPTER 5 Information Theory 5. Introduction As mentioned in Chapter and reiterated along the way, the purpose of a communication system is
More informationAalborg Universitet. Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes
Aalborg Universitet Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes Published in: 2004 International Seminar on Communications DOI link to publication
More informationSecret Key and Private Key Constructions for Simple Multiterminal Source Models
Secret Key and Private Key Constructions for Simple Multiterminal Source Models arxiv:cs/05050v [csit] 3 Nov 005 Chunxuan Ye Department of Electrical and Computer Engineering and Institute for Systems
More information4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information
4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk
More informationEECS 750. Hypothesis Testing with Communication Constraints
EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.
More informationEE 4TM4: Digital Communications II. Channel Capacity
EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.
More information