Information Theoretic Limits of Randomness Generation


1 Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016

2 Information theory The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. A Mathematical Theory of Communication, Shannon (1948) El Gamal (Stanford University) Shannon Centennial 2 / 44

3 Information theory Probabilistic models for the source and the channel El Gamal (Stanford University) Shannon Centennial 3 / 44

4 Information theory Probabilistic models for the source and the channel Entropy as measure of information El Gamal (Stanford University) Shannon Centennial 3 / 44

5 Information theory Probabilistic models for the source and the channel Entropy as measure of information Minimum rate for compression: entropy H El Gamal (Stanford University) Shannon Centennial 3 / 44

6 Information theory Probabilistic models for the source and the channel Entropy as measure of information Minimum rate for compression: entropy H Highest rate of reliable communication: capacity C = max mutual information I El Gamal (Stanford University) Shannon Centennial 3 / 44

7 Information theory Probabilistic models for the source and the channel Entropy as measure of information Minimum rate for compression: entropy H Highest rate of reliable communication: capacity C = max mutual information I Source can be communicated over channel iff H ≤ C Bits as a universal interface between source and channel El Gamal (Stanford University) Shannon Centennial 3 / 44

8 Profound impact on practice Einstein looms large, and rightly so. But we're not living in the relativity age, we're living in the information age. It's Shannon whose fingerprints are on every electronic device we own, every computer screen we gaze into, every means of digital communication. He's one of these people who so transform the world that, after the transformation, the old world is forgotten. James Gleick in the NEW YORKER, April 30, 2016 El Gamal (Stanford University) Shannon Centennial 4 / 44

9 Information theory beyond communication El Gamal (Stanford University) Shannon Centennial 5 / 44

10 Information theory beyond communication INFORMATION theory has, in the last few years, become something of a scientific bandwagon. Starting as a technical tool for the communication engineer, it has received an extraordinary amount of publicity in the popular as well as the scientific press....as a consequence, it has perhaps been ballooned to an importance beyond its actual accomplishments.... Applications are being made to biology, psychology, linguistics, fundamental physics, economics, the theory of organization, and many others.... The Bandwagon, Shannon (1956) El Gamal (Stanford University) Shannon Centennial 5 / 44

11 Information theory beyond communication INFORMATION theory has, in the last few years, become something of a scientific bandwagon. Starting as a technical tool for the communication engineer, it has received an extraordinary amount of publicity in the popular as well as the scientific press.... As a consequence, it has perhaps been ballooned to an importance beyond its actual accomplishments.... Applications are being made to biology, psychology, linguistics, fundamental physics, economics, the theory of organization, and many others.... I personally believe that many of the concepts of information theory will prove useful in these other fields and, indeed, some results are already quite promising, but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification.... The Bandwagon, Shannon (1956) El Gamal (Stanford University) Shannon Centennial 5 / 44

12 Information theory beyond communication Elements of Information Theory, Cover Thomas (1991) El Gamal (Stanford University) Shannon Centennial 5 / 44

13 Randomness generation Generating random variables with prescribed distribution from coin flips El Gamal (Stanford University) Shannon Centennial 6 / 44

14 Randomness generation Generating random variables with prescribed distribution from coin flips Important applications: Monte Carlo simulation Randomized algorithms Cryptography Fundamental questions: Notions and measures of common information Secrecy Classical and quantum channel synthesis El Gamal (Stanford University) Shannon Centennial 6 / 44

15 Randomness generation Generating random variables with prescribed distribution from coin flips Important applications: Monte Carlo simulation Randomized algorithms Cryptography Fundamental questions: Notions and measures of common information Secrecy Classical and quantum channel synthesis What is the minimum number of coin flips needed? Entropy and mutual information arise naturally as limits/bounds Proofs use information theoretic techniques El Gamal (Stanford University) Shannon Centennial 6 / 44

16 Outline Centralized randomness generation: One shot using a variable number of coin flips Asymptotic using a fixed number of coin flips El Gamal (Stanford University) Shannon Centennial 7 / 44

17 Outline Centralized randomness generation: One shot using a variable number of coin flips Asymptotic using a fixed number of coin flips Distributed randomness generation: Asymptotic One shot recent results El Gamal (Stanford University) Shannon Centennial 7 / 44

18 Outline Centralized randomness generation: One shot using a variable number of coin flips Asymptotic using a fixed number of coin flips Distributed randomness generation: Asymptotic One shot recent results Channel synthesis (common randomness/communication tradeoff): Asymptotic One shot El Gamal (Stanford University) Shannon Centennial 7 / 44

19 One shot randomness generation Randomness source W 1,W 2,... Generator X W 1,W 2,... is a Bern(1/2) sequence; X ∼ p(x) Use prefix-free code to generate X (as in zero error compression) El Gamal (Stanford University) Shannon Centennial 8 / 44

20 One shot randomness generation Randomness source W 1,W 2,... Generator X W 1,W 2,... is a Bern(1/2) sequence; X ∼ p(x) Use prefix-free code to generate X (as in zero error compression) Example: generating X ∼ Bern(1/3) with a binary tree: prefix 0 gives X = 0; prefix 10 gives X = 1; prefix 110 gives X = 0; prefix 1110 gives X = 1; and so on down the tree, so that P{X = 1} = 1/4 + 1/16 + ... = 1/3 El Gamal (Stanford University) Shannon Centennial 8 / 44
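
As an illustration (my own Python sketch, not part of the slides), the tree above can be run directly: flip fair bits until the first 0 appears; an even number of flips gives X = 1 and an odd number gives X = 0. The helper name bern_one_third is hypothetical.

```python
import random

def bern_one_third():
    """Generate X ~ Bern(1/3) from fair coin flips; return (X, number of flips used)."""
    flips = 0
    while True:
        flips += 1
        if random.getrandbits(1) == 0:      # prefix ...0  -> X = 0
            return 0, flips
        flips += 1
        if random.getrandbits(1) == 0:      # prefix ...10 -> X = 1
            return 1, flips
        # prefix ...11 -> keep flipping (descend further down the tree)

if __name__ == "__main__":
    samples = [bern_one_third() for _ in range(100_000)]
    print("P{X=1} ≈", sum(x for x, _ in samples) / len(samples))   # ≈ 1/3
    print("E(L)   ≈", sum(l for _, l in samples) / len(samples))   # ≈ 2
```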

21 One shot randomness generation Randomness source W 1,W 2,... Generator X W 1,W 2,... is a Bern(1/2) sequence; X ∼ p(x) Use prefix-free code to generate X (as in zero error compression) Let L be the number of W i bits used to generate X What is the minimum E(L)? El Gamal (Stanford University) Shannon Centennial 8 / 44

22 One shot randomness generation Randomness source W 1,W 2,... Generator X W 1,W 2,... is a Bern(1/2) sequence; X ∼ p(x) Use prefix-free code to generate X (as in zero error compression) Let L be the number of W i bits used to generate X What is the minimum E(L)? Knuth Yao (1976) showed that: H(X) ≤ min E(L) < H(X) + 2 bits/symbol, where H(X) = E[-log p(X)] is the entropy of X Measure of randomness of X El Gamal (Stanford University) Shannon Centennial 8 / 44

23 One shot randomness generation Randomness source W 1,W 2,... Generator X Example: generating X ∼ Bern(1/3) with the tree above: E(L) = (1/2)(1) + (1/4)(2) + (1/4)(2 + E(L)), which gives E(L) = 2, while H(1/3) = 0.918 El Gamal (Stanford University) Shannon Centennial 9 / 44

24 Asymptotic randomness generation Randomness source W^{nR} Generator X^n W^{nR} is an i.i.d. Bern(1/2) nR-bit sequence; X ∼ p(x) El Gamal (Stanford University) Shannon Centennial 10 / 44

25 Asymptotic randomness generation Randomness source W^{nR} Generator X^n W^{nR} is an i.i.d. Bern(1/2) nR-bit sequence; X ∼ p(x) Wish to generate X^n such that: p_{X^n}(x^n) ≈ ∏_{i=1}^n p_X(x_i) El Gamal (Stanford University) Shannon Centennial 10 / 44

26 Asymptotic randomness generation Randomness source W^{nR} Generator X^n W^{nR} is an i.i.d. Bern(1/2) nR-bit sequence; X ∼ p(x) Wish to generate X^n such that: p_{X^n}(x^n) ≈ ∏_{i=1}^n p_X(x_i) More specifically, we want the total variation distance: d_n = Σ_{x^n} | p_{X^n}(x^n) - ∏_{i=1}^n p_X(x_i) | → 0 as n → ∞ What is the minimum randomness generation rate R? El Gamal (Stanford University) Shannon Centennial 10 / 44

27 Asymptotic randomness generation Randomness source W^{nR} Generator X^n W^{nR} is an i.i.d. Bern(1/2) nR-bit sequence; X ∼ p(x) Wish to generate X^n such that: p_{X^n}(x^n) ≈ ∏_{i=1}^n p_X(x_i) More specifically, we want the total variation distance: d_n = Σ_{x^n} | p_{X^n}(x^n) - ∏_{i=1}^n p_X(x_i) | → 0 as n → ∞ What is the minimum randomness generation rate R? Turns out min R = H(X) El Gamal (Stanford University) Shannon Centennial 10 / 44

28 Random coding generation scheme Randomness source W^{nR} Generator X^n Use random coding to show existence of codes (Han Verdu 1993): Randomly generate X^n(w), w ∈ [1:2^{nR}], each X^n(w) ∼ ∏_{i=1}^n p_X(x_i) The generated code is revealed to the generator El Gamal (Stanford University) Shannon Centennial 10 / 44

29 Random coding generation scheme Randomness source W^{nR} Generator X^n Use random coding to show existence of codes (Han Verdu 1993): Randomly generate X^n(w), w ∈ [1:2^{nR}], each X^n(w) ∼ ∏_{i=1}^n p_X(x_i) The generated code is revealed to the generator Upon observing W = w, the generator outputs X^n = X^n(w) El Gamal (Stanford University) Shannon Centennial 10 / 44

30 Random coding generation scheme Randomness source W^{nR} Generator X^n Use random coding to show existence of codes (Han Verdu 1993): Randomly generate X^n(w), w ∈ [1:2^{nR}], each X^n(w) ∼ ∏_{i=1}^n p_X(x_i) The generated code is revealed to the generator Upon observing W = w, the generator outputs X^n = X^n(w) Can show that: E(d_n) → 0 if R > H(X) Hence, there exists a sequence of codes with d_n → 0 if R > H(X) This type of existence proof extends to distributed generation settings El Gamal (Stanford University) Shannon Centennial 10 / 44
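
A rough numerical sketch of this random coding argument (my own illustration; the block length n = 8, rate R = 1.2, and Bern(1/3) source are assumed values, not from the talk): draw a codebook of 2^{nR} i.i.d. codewords, let the generator output the codeword indexed by the uniform seed, and compute the resulting total variation distance d_n exactly for this small n.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
p, n, R = 1 / 3, 8, 1.2                     # assumed: Bern(p) source, block length n, rate R
num_codewords = int(2 ** (n * R))

# Codebook: each codeword X^n(w) drawn i.i.d. ~ Bern(p); revealed to the generator.
codebook = rng.random((num_codewords, n)) < p

# Induced distribution of the output: W uniform over [1:2^{nR}], output X^n = X^n(W).
induced = {}
for cw in codebook:
    key = tuple(int(b) for b in cw)
    induced[key] = induced.get(key, 0.0) + 1.0 / num_codewords

# Total variation distance d_n to the target i.i.d. distribution.
d_n = 0.0
for x in itertools.product([0, 1], repeat=n):
    target = np.prod([p if b else 1 - p for b in x])
    d_n += abs(induced.get(x, 0.0) - target)
print(f"d_n = {d_n:.3f}  (tends to 0 as n grows, since R > H(1/3) ≈ 0.918)")
```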

31 Summary of centralized randomness generation One shot: Entropy is the limit, same as for zero error compression Both use optimal prefix codes Asymptotic: Entropy is the limit, same as for lossless compression Use random coding (key technique in information theory) Same limit for one shot and asymptotic, even though the asymptotic setting only requires an approximate (more relaxed) condition El Gamal (Stanford University) Shannon Centennial 11 / 44

32 Outline Centralized randomness generation: One shot Asymptotic Distributed randomness generation: Asymptotic One shot recent results Channel synthesis (common randomness/communication tradeoff): Asymptotic One shot El Gamal (Stanford University) Shannon Centennial 12 / 44

33 Asymptotic distributed randomness generation Common randomness source W^{nR} Alice Bob X_1^n X_2^n W^{nR} i.i.d. Bern(1/2) sequence; (X_1,X_2) ∼ p(x_1,x_2) Alice and Bob each has unlimited independent local randomness El Gamal (Stanford University) Shannon Centennial 13 / 44

34 Asymptotic distributed randomness generation Common randomness source W^{nR} Alice Bob X_1^n X_2^n W^{nR} i.i.d. Bern(1/2) sequence; (X_1,X_2) ∼ p(x_1,x_2) Alice and Bob each has unlimited independent local randomness Alice generates X_1^n and Bob generates X_2^n such that: d_n = Σ_{(x_1^n,x_2^n)} | p_{X_1^n,X_2^n}(x_1^n,x_2^n) - ∏_{i=1}^n p_{X_1,X_2}(x_{1i},x_{2i}) | → 0 What is the minimum common randomness rate R? El Gamal (Stanford University) Shannon Centennial 13 / 44

35 Asymptotic distributed randomness generation Common randomness source W^{nR} Alice Bob X_1^n X_2^n W^{nR} i.i.d. Bern(1/2) sequence; (X_1,X_2) ∼ p(x_1,x_2) Alice and Bob each has unlimited independent local randomness Alice generates X_1^n and Bob generates X_2^n such that: d_n = Σ_{(x_1^n,x_2^n)} | p_{X_1^n,X_2^n}(x_1^n,x_2^n) - ∏_{i=1}^n p_{X_1,X_2}(x_{1i},x_{2i}) | → 0 What is the minimum common randomness rate R? Wyner (1975) showed that: min R = J(X_1;X_2) = min_{X_1 - U - X_2} I(X_1,X_2;U) Wyner common information El Gamal (Stanford University) Shannon Centennial 13 / 44

36 Asymptotic distributed randomness generation Common randomness source W^{nR} Alice Bob X_1^n X_2^n Wyner (1975) showed that: min R = J(X_1;X_2) = min_{X_1 - U - X_2} I(X_1,X_2;U) Wyner common information Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) Another measure of common information El Gamal (Stanford University) Shannon Centennial 14 / 44

37 Asymptotic distributed randomness generation Common randomness source W^{nR} Alice Bob X_1^n X_2^n Wyner (1975) showed that: min R = J(X_1;X_2) = min_{X_1 - U - X_2} I(X_1,X_2;U) Wyner common information Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) Another measure of common information It is not difficult to see that: I(X_1;X_2) ≤ J(X_1;X_2) ≤ min{H(X_1), H(X_2)} El Gamal (Stanford University) Shannon Centennial 14 / 44

38 Wyner's generation scheme Show that R > J(X_1;X_2) = min_{X_1 - U - X_2} I(X_1,X_2;U) suffices El Gamal (Stanford University) Shannon Centennial 15 / 44

39 Wyner's generation scheme Let p(u)p(x_1|u)p(x_2|u) (with p(x_1,x_2) given) attain J(X_1;X_2) Generate U^n(w), w ∈ [1:2^{nR}], each ∼ ∏_{i=1}^n p_U(u_i) Codebook revealed to Alice and Bob before communication Source outputs W = w Alice generates X_1^n ∼ ∏_{i=1}^n p_{X_1|U}(x_{1i} | u_i(w)) Bob generates X_2^n ∼ ∏_{i=1}^n p_{X_2|U}(x_{2i} | u_i(w)) El Gamal (Stanford University) Shannon Centennial 15 / 44

40 Wyner's generation scheme Let p(u)p(x_1|u)p(x_2|u) (with p(x_1,x_2) given) attain J(X_1;X_2) Generate U^n(w), w ∈ [1:2^{nR}], each ∼ ∏_{i=1}^n p_U(u_i) Codebook revealed to Alice and Bob before communication Source outputs W = w Alice generates X_1^n ∼ ∏_{i=1}^n p_{X_1|U}(x_{1i} | u_i(w)) Bob generates X_2^n ∼ ∏_{i=1}^n p_{X_2|U}(x_{2i} | u_i(w)) [figure: the codebook of U^n(w) sequences, shared by Alice and Bob] El Gamal (Stanford University) Shannon Centennial 15 / 44

41 Wyner's generation scheme Let p(u)p(x_1|u)p(x_2|u) (with p(x_1,x_2) given) attain J(X_1;X_2) Generate U^n(w), w ∈ [1:2^{nR}], each ∼ ∏_{i=1}^n p_U(u_i) Codebook revealed to Alice and Bob before communication Source outputs W = w, from which Alice and Bob recover U^n(w) Alice generates X_1^n ∼ ∏_{i=1}^n p_{X_1|U}(x_{1i} | u_i(w)) Bob generates X_2^n ∼ ∏_{i=1}^n p_{X_2|U}(x_{2i} | u_i(w)) [figure: Alice passes U^n(w) through p(x_1|u) to produce X_1^n; Bob passes U^n(w) through p(x_2|u) to produce X_2^n] El Gamal (Stanford University) Shannon Centennial 15 / 44

42 Wyner's generation scheme Let p(u)p(x_1|u)p(x_2|u) (with p(x_1,x_2) given) attain J(X_1;X_2) Generate U^n(w), w ∈ [1:2^{nR}], each ∼ ∏_{i=1}^n p_U(u_i) Codebook revealed to Alice and Bob before communication Source outputs W = w, from which Alice and Bob recover U^n(w) Alice generates X_1^n ∼ ∏_{i=1}^n p_{X_1|U}(x_{1i} | u_i(w)) Bob generates X_2^n ∼ ∏_{i=1}^n p_{X_2|U}(x_{2i} | u_i(w)) Using asymptotic centralized generation scheme: R > H(U) suffices El Gamal (Stanford University) Shannon Centennial 15 / 44

43 Wyner's generation scheme Let p(u)p(x_1|u)p(x_2|u) (with p(x_1,x_2) given) attain J(X_1;X_2) Generate U^n(w), w ∈ [1:2^{nR}], each ∼ ∏_{i=1}^n p_U(u_i) Codebook revealed to Alice and Bob before communication Source outputs W = w, from which Alice and Bob recover U^n(w) Alice generates X_1^n ∼ ∏_{i=1}^n p_{X_1|U}(x_{1i} | u_i(w)) Bob generates X_2^n ∼ ∏_{i=1}^n p_{X_2|U}(x_{2i} | u_i(w)) Using asymptotic centralized generation scheme: R > H(U) suffices Wyner (1975) showed that only R > I(X_1,X_2;U) (≤ H(U)) suffices Proof uses a covering lemma similar to lossy compression El Gamal (Stanford University) Shannon Centennial 15 / 44

44 Wyner's generation scheme Let p(u)p(x_1|u)p(x_2|u) (with p(x_1,x_2) given) attain J(X_1;X_2) Generate U^n(w), w ∈ [1:2^{nR}], each ∼ ∏_{i=1}^n p_U(u_i) Codebook revealed to Alice and Bob before communication Source outputs W = w, from which Alice and Bob recover U^n(w) Alice generates X_1^n ∼ ∏_{i=1}^n p_{X_1|U}(x_{1i} | u_i(w)) Bob generates X_2^n ∼ ∏_{i=1}^n p_{X_2|U}(x_{2i} | u_i(w)) Using asymptotic centralized generation scheme: R > H(U) suffices Wyner (1975) showed that only R > I(X_1,X_2;U) (≤ H(U)) suffices Proof uses a covering lemma similar to lossy compression Converse (min R ≥ J(X_1;X_2)) uses information theoretic inequalities El Gamal (Stanford University) Shannon Centennial 15 / 44

45 One shot distributed randomness generation Alice W 1,W 2,... W Generator Bob X 1 X 2 W 1,W 2,... i.i.d. Bern(1/2); (X 1,X 2 ) p(x 1,x 2 ) Generator outputs common randomness variable W using prefix code El Gamal (Stanford University) Shannon Centennial 16 / 44

46 One shot distributed randomness generation Alice W 1,W 2,... W Generator Bob X 1 X 2 W 1,W 2,... i.i.d. Bern(1/2); (X 1,X 2 ) p(x 1,x 2 ) Generator outputs common randomness variable W using prefix code Let L be number of W i bits used to generate W What is the minimum E(L)? El Gamal (Stanford University) Shannon Centennial 16 / 44

47 One shot distributed randomness generation Alice W 1,W 2,... W Generator Bob X_1 X_2 What is the minimum E(L)? By Knuth Yao (1976) can show: G(X_1;X_2) ≤ min E(L) < G(X_1;X_2) + 2, where G(X_1;X_2) = min_{X_1 - W - X_2} H(W) is the common entropy (Kumar Li EG 2014) Another measure of common information (in addition to I and J) El Gamal (Stanford University) Shannon Centennial 16 / 44

48 One shot distributed randomness generation Alice W 1,W 2,... W Generator Bob X_1 X_2 What is the minimum E(L)? By Knuth Yao (1976) can show: G(X_1;X_2) ≤ min E(L) < G(X_1;X_2) + 2, where G(X_1;X_2) = min_{X_1 - W - X_2} H(W) is the common entropy (Kumar Li EG 2014) Another measure of common information (in addition to I and J) G(X_1;X_2) is very difficult to compute except for simple cases El Gamal (Stanford University) Shannon Centennial 16 / 44

49 (X_1,X_2) discrete W 1,W 2,... Generator W Alice X_1 Bob X_2 I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) ≤ min{H(X_1),H(X_2)} I(X_1;X_2): mutual information J(X_1;X_2) = min_{X_1 - U - X_2} I(X_1,X_2;U): Wyner common information G(X_1;X_2) = min_{X_1 - W - X_2} H(W): common entropy All inequalities can be strict El Gamal (Stanford University) Shannon Centennial 17 / 44

50 Example: (X_1,X_2) DSBS(p) [figure: X_1 ∼ Bern(1/2) passed through a binary symmetric channel with crossover probability p to produce X_2] El Gamal (Stanford University) Shannon Centennial 18 / 44

51 Example: (X_1,X_2) DSBS(p) [plot of G(X_1;X_2), J(X_1;X_2), and I(X_1;X_2) versus p] G(X_1;X_2) = H((1/2 - p)/(1 - p)) for p ≤ 1/2 J(X_1;X_2) = 1 + H(p) - 2H(α), where 2α(1 - α) = p I(X_1;X_2) = 1 - H(p); H(X_1) = H(X_2) = 1 El Gamal (Stanford University) Shannon Centennial 18 / 44
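
A short numerical check (my own Python sketch, evaluating the formulas as stated above) of the ordering I ≤ J ≤ G for the DSBS(p) example:

```python
import numpy as np

def H(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

for p in [0.01, 0.05, 0.1, 0.2, 0.3, 0.4, 0.49]:
    alpha = (1 - np.sqrt(1 - 2 * p)) / 2          # solves 2*alpha*(1 - alpha) = p
    I = 1 - H(p)
    J = 1 + H(p) - 2 * H(alpha)
    G = H((0.5 - p) / (1 - p))                    # formula valid for p <= 1/2
    print(f"p={p:4.2f}  I={I:.3f}  J={J:.3f}  G={G:.3f}")
```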

52 (X_1,X_2) continuous (Li EG 2016) W 1,W 2,... Generator W Alice Bob X_1 X_2 We still have: I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) No known general upper bound on G or J (H(X_1) = H(X_2) = ∞) El Gamal (Stanford University) Shannon Centennial 19 / 44

53 (X_1,X_2) continuous (Li EG 2016) W 1,W 2,... Generator W Alice Bob X_1 X_2 We still have: I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) No known general upper bound on G or J (H(X_1) = H(X_2) = ∞) Example: (X_1,X_2) Gaussian with σ_1 = σ_2 = 1, 0 ≤ ρ < 1 I(X_1;X_2) = (1/2) log(1/(1 - ρ²)) J(X_1;X_2) = (1/2) log((1 + ρ)/(1 - ρ)), achieved by U ∼ N(0, ρ), X_j = U + Z_j, Z_j ∼ N(0, 1 - ρ) Is G(X_1;X_2) also finite? El Gamal (Stanford University) Shannon Centennial 19 / 44
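
A quick numerical evaluation of these two formulas (my own sketch, not from the talk); for these expressions the gap J - I equals log(1 + ρ), which stays below one bit even as ρ → 1.

```python
import numpy as np

for rho in [0.1, 0.5, 0.9, 0.99]:
    I = 0.5 * np.log2(1 / (1 - rho ** 2))            # mutual information
    J = 0.5 * np.log2((1 + rho) / (1 - rho))         # Wyner common information
    print(f"rho={rho:.2f}  I={I:.3f}  J={J:.3f}  J-I={J - I:.3f} bits")
```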

54 Upper bound on G For (X_1,X_2) with log-concave pdf: Theorem (Li EG 2016) I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) ≤ I(X_1;X_2) + 24 El Gamal (Stanford University) Shannon Centennial 20 / 44

55 Upper bound on G For (X_1,X_2) with log-concave pdf: Theorem (Li EG 2016) I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) ≤ I(X_1;X_2) + 24 Result extends to generation of n continuous random variables El Gamal (Stanford University) Shannon Centennial 20 / 44

56 Outline of proof (X_1,X_2) uniformly distributed over a set S in R²: Construct W using dyadic decomposition scheme Bound H(W) by I(X_1;X_2) + const. for S convex using erosion entropy Extend proof to general pdf: Apply scheme to uniform over positive hypograph of pdf Bound G(X_1;X_2) by I(X_1;X_2) + 24 for log-concave pdf El Gamal (Stanford University) Shannon Centennial 21 / 44

57 (X_1,X_2) ∼ Unif(S) Dyadic square: a 2^{-k} × 2^{-k} square of the form 2^{-k}([i, i+1] × [j, j+1]), k ∈ Z, (i, j) ∈ Z² El Gamal (Stanford University) Shannon Centennial 22 / 44

58 (X_1,X_2) ∼ Unif(S) Dyadic square: a 2^{-k} × 2^{-k} square of the form 2^{-k}([i, i+1] × [j, j+1]), k ∈ Z, (i, j) ∈ Z² Dyadic decomposition: Partition S into largest possible dyadic squares El Gamal (Stanford University) Shannon Centennial 22 / 44

59 Dyadic decomposition for an ellipse k < 1: no dyadic squares inside S El Gamal (Stanford University) Shannon Centennial 23 / 44

60 61 62 Dyadic decomposition for an ellipse [figures: the decomposition fills in with successively smaller dyadic squares as k increases] El Gamal (Stanford University) Shannon Centennial 23 / 44

63 Constructing W from dyadic decomposition W_D is the dyadic square containing (X_1,X_2) Conditioned on W_D, X_1 and X_2 are uniform over the dyadic square, hence independent Hence, X_1 - W_D - X_2 El Gamal (Stanford University) Shannon Centennial 24 / 44

64 Constructing W from dyadic decomposition W_D is the dyadic square containing (X_1,X_2) Conditioned on W_D, X_1 and X_2 are uniform over the dyadic square, hence independent Hence, X_1 - W_D - X_2 And, H(W_D) ≥ G(X_1;X_2) = min_{X_1 - W - X_2} H(W) El Gamal (Stanford University) Shannon Centennial 24 / 44
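
Because dyadic squares containing a given point are nested, the square W_D assigned to (X_1,X_2) is simply the largest dyadic square that contains the point and lies inside S. The sketch below is my own illustration (S is represented by an indicator function, and the square-inside-S test checks the four corners, which is exact when S is convex); the function name and the example set are hypothetical.

```python
import math

def dyadic_square(x1, x2, in_S, k_max=30):
    """Return (k, i, j) for the dyadic square 2^{-k} * ([i, i+1] x [j, j+1])
    that contains (x1, x2) and lies inside S (the largest such square)."""
    for k in range(0, k_max + 1):          # assumes no dyadic square of side > 1 fits in S
        side = 2.0 ** (-k)
        i, j = math.floor(x1 / side), math.floor(x2 / side)
        corners = [(i * side, j * side), ((i + 1) * side, j * side),
                   (i * side, (j + 1) * side), ((i + 1) * side, (j + 1) * side)]
        if all(in_S(cx, cy) for cx, cy in corners):
            return k, i, j
    raise ValueError("no dyadic square of side >= 2^-k_max works for this point")

# Example: S = disk of radius 1 centered at (1, 1), so coordinates are positive.
in_disk = lambda x, y: (x - 1) ** 2 + (y - 1) ** 2 <= 1
print(dyadic_square(0.9, 1.2, in_disk))    # (1, 1, 2): the square [0.5, 1.0] x [1.0, 1.5]
```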

65 Generating W D from coin flips Generate W D using a prefix code Dyadic decomposition and code revealed to all parties El Gamal (Stanford University) Shannon Centennial 25 / 44

66 67 68 Generation scheme for (X_1,X_2) ∼ Unif(S) [figures: the coin flip sequence (e.g., 1100...) selects the dyadic square W_D via the prefix code; X_1 and X_2 are then generated independently and uniformly within that square] El Gamal (Stanford University) Shannon Centennial 26 / 44

69 (X_1,X_2) uniform over ellipse [plot comparing H(W_D) with I(X_1;X_2)] Gap between I and G of 6 bits (vs. 24 for general log-concave pdf) El Gamal (Stanford University) Shannon Centennial 27 / 44

70 Bounding H(W_D) H(W_D) = E[-log(L_D²/A(S))], where A(S) is the area of S L_D: side length of the dyadic square containing (X_1,X_2) (1/2)(H(W_D) - log A(S)) = E[-log L_D] El Gamal (Stanford University) Shannon Centennial 28 / 44

71 Bounding H(W_D) H(W_D) = E[-log(L_D²/A(S))], where A(S) is the area of S L_D: side length of the dyadic square containing (X_1,X_2) (1/2)(H(W_D) - log A(S)) = E[-log L_D] Erosion entropy: h_⊖(S) = E[-log L_C] L_C: side length of the largest square centered at (X_1,X_2) and contained in S El Gamal (Stanford University) Shannon Centennial 28 / 44

72 Bounding H(W_D) H(W_D) = E[-log(L_D²/A(S))], where A(S) is the area of S L_D: side length of the dyadic square containing (X_1,X_2) (1/2)(H(W_D) - log A(S)) = E[-log L_D] Erosion entropy: h_⊖(S) = E[-log L_C] L_C: side length of the largest square centered at (X_1,X_2) and contained in S Lemma 1 (1/2)(H(W_D) - log A(S)) ≤ h_⊖(S) + 2 El Gamal (Stanford University) Shannon Centennial 28 / 44

73 Bounding erosion entropy for orthogonally convex set Suppose S is orthogonally convex, i.e., its intersection with each axis-aligned line is connected L_1(S), L_2(S): side lengths of the bounding box of S; A(S): area of S Then it can be shown that: Lemma 2 h_⊖(S) = E[-log L_C] ≤ log((L_1(S) + L_2(S))/A(S)) + log e El Gamal (Stanford University) Shannon Centennial 29 / 44
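
A small Monte Carlo sanity check of Lemma 2 (my own sketch; S is taken to be the unit square, an assumed example for which L_1(S) = L_2(S) = A(S) = 1):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, (200_000, 2))                 # (X1, X2) ~ Unif(unit square)

# Largest axis-aligned square centered at the point and contained in S has side
# L_C = 2 * (distance to the nearest edge of the unit square).
L_C = 2 * np.minimum.reduce([x[:, 0], 1 - x[:, 0], x[:, 1], 1 - x[:, 1]])

h_erosion = np.mean(-np.log2(L_C))                  # estimate of E[-log L_C]
bound = np.log2((1 + 1) / 1) + np.log2(np.e)        # log((L1+L2)/A) + log e
print(f"h_erosion ≈ {h_erosion:.3f}  <=  bound = {bound:.3f}")
```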

74 Bounding H(W_D) Substituting from Lemma 2 into Lemma 1, we have: H(W_D) ≤ log((L_1(S) + L_2(S))²/A(S)) + 4 + 2 log e Depends on perimeter to area ratio of S El Gamal (Stanford University) Shannon Centennial 30 / 44

75 Bounding H(W_D) Substituting from Lemma 2 into Lemma 1, we have: H(W_D) ≤ log((L_1(S) + L_2(S))²/A(S)) + 4 + 2 log e Depends on perimeter to area ratio of S A flat S has high perimeter to area ratio ⇒ high H(W_D) El Gamal (Stanford University) Shannon Centennial 30 / 44

76 Bounding H(W_D) Substituting from Lemma 2 into Lemma 1, we have: H(W_D) ≤ log((L_1(S) + L_2(S))²/A(S)) + 4 + 2 log e Depends on perimeter to area ratio of S A flat S has high perimeter to area ratio ⇒ high H(W_D) Pre-scale (X_1,X_2) to (X̃_1, X̃_2) ∼ Unif(S̃) such that L_1(S̃) = L_2(S̃) El Gamal (Stanford University) Shannon Centennial 30 / 44

77 Bounding H(W_D) Substituting from Lemma 2 into Lemma 1, we have: H(W_D) ≤ log((L_1(S) + L_2(S))²/A(S)) + 4 + 2 log e Depends on perimeter to area ratio of S A flat S has high perimeter to area ratio ⇒ high H(W_D) Pre-scale (X_1,X_2) to (X̃_1, X̃_2) ∼ Unif(S̃) such that L_1(S̃) = L_2(S̃) This yields: Proposition H(W̃_D) ≤ log(4 L_1(S) L_2(S)/A(S)) + 4 + 2 log e El Gamal (Stanford University) Shannon Centennial 30 / 44

78 Bounding H(W̃_D) by I(X_1;X_2) for S convex We have H(W̃_D) ≤ log(L_1(S) L_2(S)/A(S)) + 6 + 2 log e El Gamal (Stanford University) Shannon Centennial 31 / 44

79 Bounding H(W̃_D) by I(X_1;X_2) for S convex We have H(W̃_D) ≤ log(L_1(S) L_2(S)/A(S)) + 6 + 2 log e Now I(X_1;X_2) = h(X_1) + h(X_2) - log A(S) El Gamal (Stanford University) Shannon Centennial 31 / 44

80 Bounding H(W̃_D) by I(X_1;X_2) for S convex We have H(W̃_D) ≤ log(L_1(S) L_2(S)/A(S)) + 6 + 2 log e Now I(X_1;X_2) = h(X_1) + h(X_2) - log A(S) Combining, we have H(W̃_D) ≤ I(X_1;X_2) + (log L_1(S) - h(X_1)) + (log L_2(S) - h(X_2)) + 6 + 2 log e El Gamal (Stanford University) Shannon Centennial 31 / 44

81 Bounding H(W̃_D) by I(X_1;X_2) for S convex We have H(W̃_D) ≤ log(L_1(S) L_2(S)/A(S)) + 6 + 2 log e Now I(X_1;X_2) = h(X_1) + h(X_2) - log A(S) Combining, we have H(W̃_D) ≤ I(X_1;X_2) + (log L_1(S) - h(X_1)) + (log L_2(S) - h(X_2)) + 6 + 2 log e If S is convex, the marginals are close to uniform And, we obtain a constant gap between H(W̃_D) and I(X_1;X_2) El Gamal (Stanford University) Shannon Centennial 31 / 44

82 Generation scheme for (X_1,X_2) ∼ f Consider the positive part of the hypograph of the pdf: S = {(x_1,x_2,z) : 0 ≤ z ≤ f(x_1,x_2)} ⊂ R³ El Gamal (Stanford University) Shannon Centennial 32 / 44

83 Generation scheme for (X_1,X_2) ∼ f Consider the positive part of the hypograph of the pdf: S = {(x_1,x_2,z) : 0 ≤ z ≤ f(x_1,x_2)} ⊂ R³ Key observation: If we let (X_1,X_2,Z) ∼ Unif(S), then (X_1,X_2) ∼ f El Gamal (Stanford University) Shannon Centennial 32 / 44

84 Generation scheme for (X_1,X_2) ∼ f Consider the positive part of the hypograph of the pdf: S = {(x_1,x_2,z) : 0 ≤ z ≤ f(x_1,x_2)} ⊂ R³ Key observation: If we let (X_1,X_2,Z) ∼ Unif(S), then (X_1,X_2) ∼ f Apply dyadic decomposition scheme to uniform over S ⊂ R³ El Gamal (Stanford University) Shannon Centennial 32 / 44
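
A short simulation sketch of this key observation (my own illustration; the bivariate Gaussian density f and the bounding box are assumed choices): sample points uniformly in a box above the plane, keep those that fall under the graph of f, and drop the height coordinate.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x1, x2):
    """Assumed example: standard bivariate Gaussian pdf with correlation 0.8 (log-concave)."""
    rho = 0.8
    q = (x1 ** 2 - 2 * rho * x1 * x2 + x2 ** 2) / (1 - rho ** 2)
    return np.exp(-q / 2) / (2 * np.pi * np.sqrt(1 - rho ** 2))

def sample_via_hypograph(num, box=4.0):
    """Sample (X1, X2, Z) ~ Unif(hypograph of f) by rejection, then drop Z."""
    out = []
    fmax = f(0.0, 0.0)                      # maximum of this particular density
    while len(out) < num:
        x1, x2 = rng.uniform(-box, box, 2)
        z = rng.uniform(0.0, fmax)
        if z <= f(x1, x2):                  # the point lies inside S, the hypograph of f
            out.append((x1, x2))
    return np.array(out)

pts = sample_via_hypograph(5000)
print("empirical correlation ≈", np.corrcoef(pts[:, 0], pts[:, 1])[0, 1])  # ≈ 0.8
```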

85 Bound on G for log-concave pdf f log-concave ⇒ positive part of hypograph is orthogonally convex Using erosion entropy, scaling and additional nontrivial steps, we obtain: Theorem I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) ≤ I(X_1;X_2) + 24 El Gamal (Stanford University) Shannon Centennial 33 / 44

86 Bound on G for log-concave pdf f log-concave ⇒ positive part of hypograph is orthogonally convex Using erosion entropy, scaling and additional nontrivial steps, we obtain: Theorem I(X_1;X_2) ≤ J(X_1;X_2) ≤ G(X_1;X_2) ≤ I(X_1;X_2) + 24 Result can be extended to n agents: Theorem I_D ≤ J(X_1;...;X_n) ≤ G(X_1;...;X_n) ≤ I_D + n² log e + 9n log n I_D(X_1;...;X_n) = h(X_1,...,X_n) - Σ_{i=1}^n h(X_i | X^{i-1}, X_{i+1}^n) is the dual total correlation El Gamal (Stanford University) Shannon Centennial 33 / 44

87 Summary of distributed randomness generation Asymptotic distributed randomness generation: Wyner common information J is the limit Use random coding and a covering lemma similar to the one in lossy compression El Gamal (Stanford University) Shannon Centennial 34 / 44

88 Summary of distributed randomness generation Asymptotic distributed randomness generation: Wyner common information J is the limit Use random coding and a covering lemma similar to the one in lossy compression One shot distributed randomness generation: Common entropy is the limit Common entropy is within a constant of mutual information for log-concave pdfs El Gamal (Stanford University) Shannon Centennial 34 / 44

89 Summary of distributed randomness generation Asymptotic distributed randomness generation: Wyner common information J is the limit Use random coding and a covering lemma similar to the one in lossy compression One shot distributed randomness generation: Common entropy is the limit Common entropy is within a constant of mutual information for log-concave pdfs Limit can be larger for one shot than asymptotic (G > J) Common information has several different measures (I, J, G) El Gamal (Stanford University) Shannon Centennial 34 / 44

90 Outline Centralized randomness generation: One shot Asymptotic Distributed randomness generation: Asymptotic One shot recent results Channel synthesis (common randomness/communication tradeoff): Asymptotic One shot El Gamal (Stanford University) Shannon Centennial 35 / 44

91 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n (X,Y) ∼ p(x, y); Ŷ = Y; X^n ∼ ∏_{i=1}^n p_X(x_i) W: common randomness Bob wishes to generate Ŷ^n such that: d_n = Σ_{(x^n,y^n)} | ∏_{i=1}^n p_X(x_i) p_{Ŷ^n|X^n}(y^n | x^n) - ∏_{i=1}^n p_{X,Y}(x_i, y_i) | → 0 What is the minimum communication rate R? El Gamal (Stanford University) Shannon Centennial 36 / 44

92 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n (X,Y) ∼ p(x, y); Ŷ = Y; X^n ∼ ∏_{i=1}^n p_X(x_i) W: common randomness Bob wishes to generate Ŷ^n such that: d_n = Σ_{(x^n,y^n)} | ∏_{i=1}^n p_X(x_i) p_{Ŷ^n|X^n}(y^n | x^n) - ∏_{i=1}^n p_{X,Y}(x_i, y_i) | → 0 What is the minimum communication rate R? Bennett Shor Smolin Thapliyal (2002) showed that: min R = I(X;Y) El Gamal (Stanford University) Shannon Centennial 36 / 44

93 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n For X^n arbitrary: min R = max_{p(x)} I(X;Y) = C, the capacity of the channel p(y|x) El Gamal (Stanford University) Shannon Centennial 36 / 44

94 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n For X^n arbitrary: min R = max_{p(x)} I(X;Y) = C, the capacity of the channel p(y|x) Reverse Shannon theorem: Noiseless channel with capacity C and unlimited common randomness can simulate any noisy channel with capacity C El Gamal (Stanford University) Shannon Centennial 36 / 44

95 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n For X^n arbitrary: min R = max_{p(x)} I(X;Y) = C, the capacity of the channel p(y|x) Reverse Shannon theorem: Noiseless channel with capacity C and unlimited common randomness can simulate any noisy channel with capacity C Shannon theorem: Noiseless channel with capacity C can be simulated by any noisy channel with capacity C El Gamal (Stanford University) Shannon Centennial 36 / 44

96 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n For X^n arbitrary: min R = max_{p(x)} I(X;Y) = C, the capacity of the channel p(y|x) Reverse Shannon theorem: Noiseless channel with capacity C and unlimited common randomness can simulate any noisy channel with capacity C Shannon theorem: Noiseless channel with capacity C can be simulated by any noisy channel with capacity C Hence any channel p can simulate a channel q iff C(p) ≥ C(q) El Gamal (Stanford University) Shannon Centennial 36 / 44

97 Asymptotic channel simulation with common randomness Common randomness source W X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n For X^n arbitrary: min R = max_{p(x)} I(X;Y) = C, the capacity of the channel p(y|x) Reverse Shannon theorem: Noiseless channel with capacity C and unlimited common randomness can simulate any noisy channel with capacity C Shannon theorem: Noiseless channel with capacity C can be simulated by any noisy channel with capacity C Hence any channel p can simulate a channel q iff C(p) ≥ C(q) Does the same hold for entanglement-assisted quantum channels? El Gamal (Stanford University) Shannon Centennial 36 / 44

98 Channel simulation without common randomness X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n (X,Y) ∼ p(x, y); X^n ∼ ∏_{i=1}^n p_X(x_i) Bob wishes to generate Ŷ^n such that: d_n = Σ_{(x^n,y^n)} | ∏_{i=1}^n p_X(x_i) p_{Ŷ^n|X^n}(y^n | x^n) - ∏_{i=1}^n p_{X,Y}(x_i, y_i) | → 0 What is the minimum communication rate R? El Gamal (Stanford University) Shannon Centennial 37 / 44

99 Channel simulation without common randomness X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n (X,Y) ∼ p(x, y); X^n ∼ ∏_{i=1}^n p_X(x_i) Bob wishes to generate Ŷ^n such that: d_n = Σ_{(x^n,y^n)} | ∏_{i=1}^n p_X(x_i) p_{Ŷ^n|X^n}(y^n | x^n) - ∏_{i=1}^n p_{X,Y}(x_i, y_i) | → 0 What is the minimum communication rate R? Cuff (2013) showed that: min R = J(X;Y) El Gamal (Stanford University) Shannon Centennial 37 / 44

100 Channel simulation without common randomness X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n (X,Y) ∼ p(x, y); X^n ∼ ∏_{i=1}^n p_X(x_i) Bob wishes to generate Ŷ^n such that: d_n = Σ_{(x^n,y^n)} | ∏_{i=1}^n p_X(x_i) p_{Ŷ^n|X^n}(y^n | x^n) - ∏_{i=1}^n p_{X,Y}(x_i, y_i) | → 0 What is the minimum communication rate R? Cuff (2013) showed that: min R = J(X;Y) Operationally shows that: J(X;Y) ≥ I(X;Y) El Gamal (Stanford University) Shannon Centennial 37 / 44

101 Channel simulation without common randomness X^n M ∈ [1:2^{nR}] Alice Bob Ŷ^n (X,Y) ∼ p(x, y); X^n ∼ ∏_{i=1}^n p_X(x_i) Bob wishes to generate Ŷ^n such that: d_n = Σ_{(x^n,y^n)} | ∏_{i=1}^n p_X(x_i) p_{Ŷ^n|X^n}(y^n | x^n) - ∏_{i=1}^n p_{X,Y}(x_i, y_i) | → 0 What is the minimum communication rate R? Cuff (2013) showed that: min R = J(X;Y) Operationally shows that: J(X;Y) ≥ I(X;Y) For X = Y, problem reduces to lossless compression with limit H(X) El Gamal (Stanford University) Shannon Centennial 37 / 44

102 Tradeoff between communication and common randomness Suppose W ∈ [1:2^{nR_C}] (still have M ∈ [1:2^{nR}]) El Gamal (Stanford University) Shannon Centennial 38 / 44

103 Tradeoff between communication and common randomness Suppose W ∈ [1:2^{nR_C}] (still have M ∈ [1:2^{nR}]) Cuff (2013), Bennett Devetak Harrow Shor Winter (2014) established optimal tradeoff between R and common randomness rate R_C: the set of (R, R_C) such that R ≥ I(X;U), R_C + R ≥ I(X,Y;U) for some X - U - Y [figure: the tradeoff region in the (R, R_C) plane, with R ranging between I(X;Y) and J(X;Y)] El Gamal (Stanford University) Shannon Centennial 38 / 44

104 Tradeoff between communication and common randomness Suppose W ∈ [1:2^{nR_C}] (still have M ∈ [1:2^{nR}]) Cuff (2013), Bennett Devetak Harrow Shor Winter (2014) established optimal tradeoff between R and common randomness rate R_C: the set of (R, R_C) such that R ≥ I(X;U), R_C + R ≥ I(X,Y;U) for some X - U - Y [figure: the tradeoff region in the (R, R_C) plane, with R ranging between I(X;Y) and J(X;Y)] Application to information theoretic secrecy (Cuff 2013) El Gamal (Stanford University) Shannon Centennial 38 / 44

105 One shot channel simulation with common randomness Common randomness source W X Alice M ∈ {0,1}* Bob Y (X,Y) ∼ p(x, y); L: length of prefix code for M Bob wishes to generate Y ∼ p_{Y|X}(y|x) What is the minimum E(L)? El Gamal (Stanford University) Shannon Centennial 39 / 44

106 One shot channel simulation with common randomness Common randomness source W X Alice M ∈ {0,1}* Bob Y (X,Y) ∼ p(x, y); L: length of prefix code for M Bob wishes to generate Y ∼ p_{Y|X}(y|x) What is the minimum E(L)? Harsha Jain McAllester Radhakrishnan (2010) showed that: I(X;Y) ≤ min E(L) ≤ I(X;Y) + 2 log(I(X;Y) + 1) + O(1) El Gamal (Stanford University) Shannon Centennial 39 / 44

107 One shot channel simulation without common randomness X Alice M ∈ {0,1}* Bob Y By Kumar Li EG (2014): G(X;Y) ≤ min E(L) < G(X;Y) + 1 For X = Y, problem reduces to zero error compression with limit H(X) El Gamal (Stanford University) Shannon Centennial 40 / 44

108 One shot channel simulation without common randomness X Alice M ∈ {0,1}* Bob Y By Kumar Li EG (2014): G(X;Y) ≤ min E(L) < G(X;Y) + 1 For X = Y, problem reduces to zero error compression with limit H(X) Optimal tradeoff between R and R_C is not known [figure: unknown tradeoff region between I(X;Y) and G(X;Y) in the (R, R_C) plane] El Gamal (Stanford University) Shannon Centennial 40 / 44

109 Conclusion Limits of randomness generation and communication closely related El Gamal (Stanford University) Shannon Centennial 41 / 44

110 Conclusion Limits of randomness generation and communication closely related Entropy is the limit on compression and randomness generation El Gamal (Stanford University) Shannon Centennial 41 / 44

111 Conclusion Limits of randomness generation and communication closely related Entropy is the limit on compression and randomness generation Mutual information is the limit on: Channel capacity and lossy compression Channel simulation with common randomness El Gamal (Stanford University) Shannon Centennial 41 / 44

112 Conclusion Limits of randomness generation and communication closely related Entropy is the limit on compression and randomness generation Mutual information is the limit on: Channel capacity and lossy compression Channel simulation with common randomness Wyner common information is the limit on: Asymptotic distributed randomness generation Asymptotic channel simulation without common randomness El Gamal (Stanford University) Shannon Centennial 41 / 44

113 Conclusion Limits of randomness generation and communication closely related Entropy is the limit on compression and randomness generation Mutual information is the limit on: Channel capacity and lossy compression Channel simulation with common randomness Wyner common information is the limit on: Asymptotic distributed randomness generation Asymptotic channel simulation without common randomness Common entropy is the limit on: One shot distributed randomness generation One shot channel simulation without common randomness El Gamal (Stanford University) Shannon Centennial 41 / 44

114 Conclusion Limits of randomness generation and communication closely related Entropy is the limit on compression and randomness generation Mutual information is the limit on: Channel capacity and lossy compression Channel simulation with common randomness Wyner common information is the limit on: Asymptotic distributed randomness generation Asymptotic channel simulation without common randomness Common entropy is the limit on: One shot distributed randomness generation One shot channel simulation without common randomness Natural application of information theoretic ideas and techniques El Gamal (Stanford University) Shannon Centennial 41 / 44

115 Thank you!

116 References
Bennett, C. H., Devetak, I., Harrow, A. W., Shor, P. W., and Winter, A. (2014). The quantum reverse Shannon theorem and resource tradeoffs for simulating quantum channels. IEEE Trans. Inf. Theory, 60(5).
Bennett, C. H., Shor, P. W., Smolin, J., and Thapliyal, A. V. (2002). Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem. IEEE Trans. Inf. Theory, 48(10).
Cover, T. M. (1972). Broadcast channels. IEEE Trans. Inf. Theory, 18(1).
Cover, T. M. and Thomas, J. A. (1991). Elements of Information Theory. Wiley, New York.
Cuff, P. (2013). Distributed channel synthesis. IEEE Trans. Inf. Theory, 59(11).
El Gamal, A. and Kim, Y.-H. (2011). Network Information Theory. Cambridge University Press, Cambridge.
Han, T. S. and Verdu, S. (1993). Approximation theory of output statistics. IEEE Trans. Inf. Theory, 39(3).
Harsha, P., Jain, R., McAllester, D., and Radhakrishnan, J. (2010). The communication complexity of correlation. IEEE Trans. Inf. Theory, 56(1).
El Gamal (Stanford University) Shannon Centennial 43 / 44

117 References (cont.)
Knuth, D. E. and Yao, A. C. (1976). The complexity of random number generation. In Algorithms and Complexity (Proc. Symp., Carnegie Mellon Univ., Pittsburgh, Pa., 1976). Academic Press, New York.
Kumar, G., Li, C. T., and El Gamal, A. (2014). Exact common information. In Proc. IEEE Int. Symp. Inf. Theory (ISIT).
Li, C. T. and El Gamal, A. (2016). Distributed simulation of continuous random variables. In Proc. IEEE Int. Symp. Inf. Theory (ISIT).
Shannon, C. E. (1948). A mathematical theory of communication. Bell Syst. Tech. J., 27(3) and 27(4).
Shannon, C. E. (1956). The zero error capacity of a noisy channel. IRE Trans. Inf. Theory, 2(3).
Shannon, C. E. (1961). Two-way communication channels. In Proc. 4th Berkeley Symp. Math. Statist. Probab., vol. I. University of California Press, Berkeley.
Slepian, D. and Wolf, J. K. (1973). Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory, 19(4).
Wyner, A. D. (1975). The common information of two dependent random variables. IEEE Trans. Inf. Theory, 21(2).
El Gamal (Stanford University) Shannon Centennial 44 / 44


Information Theory and Coding Techniques: Chapter 1.1. What is Information Theory? Why you should take this course? Information Theory and Coding Techniques: Chapter 1.1 What is Information Theory? Why you should take this course? 1 What is Information Theory? Information Theory answers two fundamental questions in

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is

More information

On Source-Channel Communication in Networks

On Source-Channel Communication in Networks On Source-Channel Communication in Networks Michael Gastpar Department of EECS University of California, Berkeley gastpar@eecs.berkeley.edu DIMACS: March 17, 2003. Outline 1. Source-Channel Communication

More information

Ch. 8 Math Preliminaries for Lossy Coding. 8.4 Info Theory Revisited

Ch. 8 Math Preliminaries for Lossy Coding. 8.4 Info Theory Revisited Ch. 8 Math Preliminaries for Lossy Coding 8.4 Info Theory Revisited 1 Info Theory Goals for Lossy Coding Again just as for the lossless case Info Theory provides: Basis for Algorithms & Bounds on Performance

More information

(Preprint of paper to appear in Proc Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, Nov , 1990.)

(Preprint of paper to appear in Proc Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, Nov , 1990.) (Preprint of paper to appear in Proc. 1990 Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, ov. 27-30, 1990.) CAUSALITY, FEEDBACK AD DIRECTED IFORMATIO James L. Massey Institute for Signal

More information

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code

More information

Graph Coloring and Conditional Graph Entropy

Graph Coloring and Conditional Graph Entropy Graph Coloring and Conditional Graph Entropy Vishal Doshi, Devavrat Shah, Muriel Médard, Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology Cambridge,

More information

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,

More information

Towards a Theory of Information Flow in the Finitary Process Soup

Towards a Theory of Information Flow in the Finitary Process Soup Towards a Theory of in the Finitary Process Department of Computer Science and Complexity Sciences Center University of California at Davis June 1, 2010 Goals Analyze model of evolutionary self-organization

More information

Correlation Detection and an Operational Interpretation of the Rényi Mutual Information

Correlation Detection and an Operational Interpretation of the Rényi Mutual Information Correlation Detection and an Operational Interpretation of the Rényi Mutual Information Masahito Hayashi 1, Marco Tomamichel 2 1 Graduate School of Mathematics, Nagoya University, and Centre for Quantum

More information

On the Secrecy Capacity of Fading Channels

On the Secrecy Capacity of Fading Channels On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University

More information

Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding

Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California - Santa Barbara {kumar,eakyol,rose}@ece.ucsb.edu

More information

Lecture 10: Broadcast Channel and Superposition Coding

Lecture 10: Broadcast Channel and Superposition Coding Lecture 10: Broadcast Channel and Superposition Coding Scribed by: Zhe Yao 1 Broadcast channel M 0M 1M P{y 1 y x} M M 01 1 M M 0 The capacity of the broadcast channel depends only on the marginal conditional

More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of

More information

Joint Source-Channel Coding for the Multiple-Access Relay Channel

Joint Source-Channel Coding for the Multiple-Access Relay Channel Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il

More information

Discrete Memoryless Channels with Memoryless Output Sequences

Discrete Memoryless Channels with Memoryless Output Sequences Discrete Memoryless Channels with Memoryless utput Sequences Marcelo S Pinho Department of Electronic Engineering Instituto Tecnologico de Aeronautica Sao Jose dos Campos, SP 12228-900, Brazil Email: mpinho@ieeeorg

More information

On Large Deviation Analysis of Sampling from Typical Sets

On Large Deviation Analysis of Sampling from Typical Sets Communications and Signal Processing Laboratory (CSPL) Technical Report No. 374, University of Michigan at Ann Arbor, July 25, 2006. On Large Deviation Analysis of Sampling from Typical Sets Dinesh Krithivasan

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

Common Randomness Principles of Secrecy

Common Randomness Principles of Secrecy Common Randomness Principles of Secrecy Himanshu Tyagi Department of Electrical and Computer Engineering and Institute of Systems Research 1 Correlated Data, Distributed in Space and Time Sensor Networks

More information

Example: Letter Frequencies

Example: Letter Frequencies Example: Letter Frequencies i a i p i 1 a 0.0575 2 b 0.0128 3 c 0.0263 4 d 0.0285 5 e 0.0913 6 f 0.0173 7 g 0.0133 8 h 0.0313 9 i 0.0599 10 j 0.0006 11 k 0.0084 12 l 0.0335 13 m 0.0235 14 n 0.0596 15 o

More information

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview AN INTRODUCTION TO SECRECY CAPACITY BRIAN DUNN. Overview This paper introduces the reader to several information theoretic aspects of covert communications. In particular, it discusses fundamental limits

More information

Quantum Achievability Proof via Collision Relative Entropy

Quantum Achievability Proof via Collision Relative Entropy Quantum Achievability Proof via Collision Relative Entropy Salman Beigi Institute for Research in Fundamental Sciences (IPM) Tehran, Iran Setemper 8, 2014 Based on a joint work with Amin Gohari arxiv:1312.3822

More information

Lecture 2: August 31

Lecture 2: August 31 0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy

More information

Shannon s Noisy-Channel Coding Theorem

Shannon s Noisy-Channel Coding Theorem Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 2015 Abstract In information theory, Shannon s Noisy-Channel Coding Theorem states that it is possible to communicate over a noisy

More information

The Role of Directed Information in Network Capacity

The Role of Directed Information in Network Capacity The Role of Directed Information in Network Capacity Sudeep Kamath 1 and Young-Han Kim 2 1 Department of Electrical Engineering Princeton University 2 Department of Electrical and Computer Engineering

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

Lecture: Quantum Information

Lecture: Quantum Information Lecture: Quantum Information Transcribed by: Crystal Noel and Da An (Chi Chi) November 10, 016 1 Final Proect Information Find an issue related to class you are interested in and either: read some papers

More information

Lecture 15: Conditional and Joint Typicaility

Lecture 15: Conditional and Joint Typicaility EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of

More information

Nonstochastic Information Theory for Feedback Control

Nonstochastic Information Theory for Feedback Control Nonstochastic Information Theory for Feedback Control Girish Nair Department of Electrical and Electronic Engineering University of Melbourne Australia Dutch Institute of Systems and Control Summer School

More information

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Silas Fong (Joint work with Vincent Tan) Department of Electrical & Computer Engineering National University

More information

Feedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case

Feedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case 1 arxiv:0901.3580v1 [cs.it] 23 Jan 2009 Feedback Capacity of the Gaussian Interference Channel to Within 1.7075 Bits: the Symmetric Case Changho Suh and David Tse Wireless Foundations in the Department

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Quantum Hadamard channels (II)

Quantum Hadamard channels (II) Quantum Hadamard channels (II) Vlad Gheorghiu Department of Physics Carnegie Mellon University Pittsburgh, PA 15213, USA August 5, 2010 Vlad Gheorghiu (CMU) Quantum Hadamard channels (II) August 5, 2010

More information

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,

More information

5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006

5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 Source Coding With Limited-Look-Ahead Side Information at the Decoder Tsachy Weissman, Member, IEEE, Abbas El Gamal, Fellow,

More information

Fundamental rate-loss tradeoff for optical quantum key distribution

Fundamental rate-loss tradeoff for optical quantum key distribution Fundamental rate-loss tradeoff for optical quantum key distribution Masahiro Takeoka (NICT) Saikat Guha (BBN) Mark M. Wilde (LSU) Quantum Krispy Kreme Seminar @LSU January 30, 2015 Outline Motivation Main

More information

An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel

An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and

More information

Multiterminal Source Coding with an Entropy-Based Distortion Measure

Multiterminal Source Coding with an Entropy-Based Distortion Measure Multiterminal Source Coding with an Entropy-Based Distortion Measure Thomas Courtade and Rick Wesel Department of Electrical Engineering University of California, Los Angeles 4 August, 2011 IEEE International

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

to mere bit flips) may affect the transmission.

to mere bit flips) may affect the transmission. 5 VII. QUANTUM INFORMATION THEORY to mere bit flips) may affect the transmission. A. Introduction B. A few bits of classical information theory Information theory has developed over the past five or six

More information