On Third-Order Asymptotics for DMCs


1 On Third-Order Asymptotics for DMCs
Vincent Y. F. Tan
Institute for Infocomm Research (I²R) and National University of Singapore (NUS)
HKTW Workshop, January 2013

2 Acknowledgements
This is joint work with Marco Tomamichel
Centre for Quantum Technologies, National University of Singapore

4 Transmission of Information
[Shannon's Figure 1: INFORMATION SOURCE → TRANSMITTER → SIGNAL → RECEIVED SIGNAL → RECEIVER → DESTINATION, with a NOISE SOURCE acting on the signal]
Shannon abstracted away information meaning and semantics: treat all data equally
Shannon's bits as a universal currency: a crucial abstraction for modern communication and computing systems
Also relaxed computation and delay constraints to discover a fundamental limit, capacity, providing a goal-post to work toward
Information theory: finding fundamental limits for reliable information transmission
Channel coding: concerned with the maximum rate of communication in bits/channel use

7 Channel Coding (One-Shot)
[Block diagram: M → encoder e → X → channel W → Y → decoder d → M̂]
A code is a triple C = {M, e, d}, where M is the message set
The average error probability p_err(C) is p_err(C) := Pr[M̂ ≠ M], where M is uniform on M
The ε-error capacity is M*(W, ε) := sup{ m ∈ ℕ : ∃ C s.t. |M| = m, p_err(C) ≤ ε }

11 Channel Coding (n-shot)
[Block diagram: M → encoder e → X^n → channel W^n → Y^n → decoder d → M̂]
Consider n independent uses of a channel
Assume W is a discrete memoryless channel
For vectors x = (x_1, ..., x_n) ∈ X^n and y := (y_1, ..., y_n) ∈ Y^n,
W^n(y|x) = ∏_{i=1}^n W(y_i|x_i)
The blocklength-n, ε-error capacity is M*(W^n, ε)
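As a quick illustration, here is a minimal sketch (my own, assuming a hypothetical binary symmetric channel; not from the talk) of how memorylessness turns a block probability into a product of per-letter probabilities:

    import numpy as np

    p = 0.11                                   # assumed crossover probability
    W = np.array([[1 - p, p],                  # W[x, y] = W(y|x)
                  [p, 1 - p]])

    def W_n(y, x):
        """Memoryless block probability W^n(y|x) = prod_i W(y_i|x_i)."""
        return np.prod([W[xi, yi] for xi, yi in zip(x, y)])

    x = [0, 1, 1, 0]
    y = [0, 1, 0, 0]
    print(W_n(y, x))    # one flipped letter out of four: (1-p)^3 * p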

16 Main Contribution
Upper bound log M*(W^n, ε) for n large (converse)
Concerned with the third-order term of the asymptotic expansion
Going beyond the normal approximation terms
Theorem (Tomamichel-Tan (2013))
For all DMCs with positive ε-dispersion V_ε,
log M*(W^n, ε) ≤ nC − √(nV_ε) Q^{−1}(ε) + (1/2) log n + O(1),
where Q(a) := ∫_a^{+∞} (1/√(2π)) exp(−x²/2) dx
The (1/2) log n term is our main contribution
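The Gaussian tail function Q and its inverse are easy to evaluate numerically; the following sketch (using scipy, one option among several; not part of the talk) matches the definition above:

    from math import erfc, sqrt
    from scipy.stats import norm

    def Q(a):
        # Q(a) = int_a^infty (2*pi)^{-1/2} exp(-x^2/2) dx = erfc(a/sqrt(2))/2
        return 0.5 * erfc(a / sqrt(2))

    def Q_inv(eps):
        return norm.isf(eps)    # inverse survival function of N(0, 1)

    print(Q(0.0))               # 0.5
    print(Q_inv(0.01))          # about 2.326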

19 Main Contribution: Remarks
Our bound:
log M*(W^n, ε) ≤ nC − √(nV_ε) Q^{−1}(ε) + (1/2) log n + O(1)
Best upper bound to date:
log M*(W^n, ε) ≤ nC − √(nV_ε) Q^{−1}(ε) + (|X| − 1/2) log n + O(1)
V. Strassen (1964); Polyanskiy-Poor-Verdú, or PPV (2010)
Requires new converse techniques

20 Outline
1 Background
2 Related work
3 Main result
4 New converse
5 Proof sketch
6 Summary and open problems

22 Background: Shannon's Channel Coding Theorem
Shannon's noisy channel coding theorem and Wolfowitz's strong converse state that
Theorem (Shannon (1949), Wolfowitz (1959))
lim_{n→∞} (1/n) log M*(W^n, ε) = C, ∀ ε ∈ (0, 1),
where C is the channel capacity, defined as C = C(W) = max_P I(P, W)
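For concreteness, C = max_P I(P, W) can be computed numerically; below is a minimal Blahut-Arimoto sketch (a standard method I am assuming here, not something the talk itself uses):

    import numpy as np

    def capacity(W, iters=200):
        """Blahut-Arimoto for a channel matrix W[x, y] = W(y|x); returns bits/use."""
        P = np.full(W.shape[0], 1.0 / W.shape[0])        # start from the uniform input
        for _ in range(iters):
            Q = P @ W                                    # induced output distribution
            with np.errstate(divide='ignore', invalid='ignore'):
                D = np.where(W > 0, W * np.log2(W / Q), 0.0).sum(axis=1)
            P = P * np.exp2(D)                           # multiplicative update
            P /= P.sum()
        return float(np.sum(P * D))

    p = 0.11
    W = np.array([[1 - p, p], [p, 1 - p]])
    print(capacity(W))    # 1 - h(0.11), about 0.5 bits/channel use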

26 Background: Shannon's Channel Coding Theorem
lim_{n→∞} (1/n) log M*(W^n, ε) = C bits/channel use
The noisy channel coding theorem is independent of ε ∈ (0, 1)
[Plot: lim_{n→∞} p_err(C) versus the rate R; the limit is 0 for R < C and 1 for R > C]
Phase transition at capacity

31 Background: ε-dispersion
What happens at capacity?
More precisely, what happens when log M ≈ nC + a√n for some a ∈ ℝ?
Assume the capacity-achieving input distribution (CAID) P* is unique
The ε-dispersion is an operational quantity that is equal to
V_ε = V(P*, W) = E_{P*}[ Var_{W(·|X)}( log (W(Y|X) / Q*(Y)) ) ],
where (X, Y) ∼ P* × W and Q*(y) = Σ_x P*(x) W(y|x)
Since the CAID is unique, V_ε = V
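The conditional-variance formula above is straightforward to evaluate; here is a small sketch (an assumed helper of mine, not from the talk, requiring W(y|x) > 0 throughout) that computes V(P, W) in bits², checked against the BSC closed form:

    import numpy as np

    def dispersion(P, W):
        """V(P, W) = sum_x P(x) Var[ log2(W(Y|x)/Q(Y)) ], with Q = PW."""
        Q = P @ W
        i = np.log2(W / Q)                          # information density log W(y|x)/Q(y)
        mean = np.sum(W * i, axis=1)                # E[i(x, Y) | X = x]
        var = np.sum(W * (i - mean[:, None]) ** 2, axis=1)
        return float(np.sum(P * var))

    p = 0.11
    W = np.array([[1 - p, p], [p, 1 - p]])
    P_star = np.array([0.5, 0.5])                   # CAID of the BSC
    print(dispersion(P_star, W))                    # p(1-p) log2((1-p)/p)^2, about 0.891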

35 Background: ε-dispersion
Assume the rate of the code satisfies (1/n) log M = C + a/√n
[Plot: lim_{n→∞} p_err(C) versus a; the limit traces the Gaussian cdf, p_err(C) → Φ(a/√V), passing through 0.5 at a = 0]
Here, we have fixed a, the second-order coding rate [Hayashi (2009)]

37 Background: ε-dispersion
Theorem (Strassen (1964), Hayashi (2009), Polyanskiy-Poor-Verdú (2010))
For every ε ∈ (0, 1), and if V_ε > 0, we have
log M*(W^n, ε) = nC − √(nV) Q^{−1}(ε) + O(log n)

39 Background: ε-dispersion
Berry-Esséen theorem: for independent X_i with zero mean and variances σ_i²,
P( (1/√n) Σ_{i=1}^n X_i ≥ a ) = Q(a/σ̄) ± 6B/√n,
where σ̄² = (1/n) Σ_{i=1}^n σ_i² and B is related to the third moments
PPV showed that the normal approximation log M*(W^n, ε) ≈ nC − √(nV) Q^{−1}(ε) is very accurate even at moderate blocklengths of 100
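A quick Monte Carlo sanity check (illustrative only; the ±1 variables and all parameters are my assumptions) shows the empirical tail of a normalized i.i.d. sum tracking Q at this Berry-Esséen accuracy:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n, trials, a = 100, 200_000, 1.0
    S = 2 * rng.binomial(n, 0.5, size=trials) - n   # sum of n i.i.d. +/-1 variables
    tail = np.mean(S / np.sqrt(n) >= a)
    print(tail, norm.sf(a))    # empirical tail vs Q(a); gap is O(1/sqrt(n))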

40 Background: ε-dispersion for the BSC
For a BSC with crossover probability p = 0.11, the normal approximation yields:
[Plot: bits per channel use versus blocklength n; the normal-approximation curves for ε = 0.01 and a smaller ε approach the capacity ≈ 0.5 from below]
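The curves in this plot can be reproduced in a few lines; the sketch below (my own, under the stated parameters) evaluates the normal approximation (1/n) log M* ≈ C − √(V/n) Q^{−1}(ε) for the BSC:

    import numpy as np
    from scipy.stats import norm

    p, eps = 0.11, 0.01
    C = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)     # 1 - h(p), about 0.5
    V = p * (1 - p) * np.log2((1 - p) / p) ** 2
    for n in [100, 500, 1000, 2000]:
        rate = C - np.sqrt(V / n) * norm.isf(eps)
        print(n, round(rate, 4))    # approaches capacity from below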

43 Related Work: Third-Order Term
Recall that we are interested in quantifying the third-order term ρ_n:
ρ_n = log M*(W^n, ε) − [ nC − √(nV) Q^{−1}(ε) ]
ρ_n = O(log n) if the channel is non-exotic
Motivation 1: ρ_n may be important at very short blocklengths
Motivation 2: because we're information theorists
"Wir müssen wissen, wir werden wissen." ("We must know, we will know.") (David Hilbert)

47 Related Work: Third-Order Term
ρ_n = log M*(W^n, ε) − [ nC − √(nV) Q^{−1}(ε) ]
For the BSC [PPV10]: ρ_n = (1/2) log n + O(1)
For the BEC [PPV10]: ρ_n = O(1)
For the AWGN channel under maximum-power constraints [PPV10]:
O(1) ≤ ρ_n ≤ (1/2) log n + O(1)
Our converse technique can be applied to the AWGN channel

51 Related Work: Achievability for Third-Order Term
Proposition (Polyanskiy (2010))
Assume that all elements of {W(y|x) : x ∈ X, y ∈ Y} are positive and C > 0. Then,
ρ_n ≥ (1/2) log n + O(1)
This is an achievability result
The BEC doesn't satisfy the assumptions
We will not try to improve on it

56 Related Work: Converse for Third-Order Term
Proposition (Polyanskiy (2010))
If W is weakly input-symmetric, then
ρ_n ≤ (1/2) log n + O(1)
This is a converse result
Gallager-symmetric channels are weakly input-symmetric
The set of weakly input-symmetric channels is very thin
We dispense with this symmetry assumption

61 Related Work: Converse for Third-Order Term
Proposition (Strassen (1964), PPV (2010))
If W is a DMC with positive ε-dispersion, then
ρ_n ≤ (|X| − 1/2) log n + O(1)
Every code can be partitioned into no more than (n+1)^{|X|−1} constant-composition subcodes
M*_P(W^n, ε): max size of a constant-composition code with type P
As such, M*(W^n, ε) ≤ (n+1)^{|X|−1} max_{P ∈ P_n(X)} M*_P(W^n, ε)
This is where the dependence on |X| comes in
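The (n+1)^{|X|−1} factor is just the polynomial count of types; a short check (illustrative, using the standard stars-and-bars formula) confirms the type count never exceeds it:

    from math import comb

    def num_types(n, alphabet_size):
        # histograms (k_1, ..., k_{|X|}) summing to n: C(n+|X|-1, |X|-1)
        return comb(n + alphabet_size - 1, alphabet_size - 1)

    X = 3
    for n in [10, 100, 1000]:
        print(n, num_types(n, X), (n + 1) ** (X - 1))   # count <= (n+1)^{|X|-1}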

66 Main Result: Tight Third-Order Term
Theorem (Tomamichel-Tan (2013))
If W is a DMC with positive ε-dispersion, then
ρ_n ≤ (1/2) log n + O(1)
The 1/2 cannot be improved without further assumptions:
for the BSC, ρ_n = (1/2) log n + O(1)
We can dispense with the positive ε-dispersion assumption as well
No need for a unique CAID

67 Main Result: Tight Third-Order Term
All cases are covered:
[Decision tree: if V_ε > 0, the bound is nC − √(nV_ε) Q^{−1}(ε) + (1/2) log n + O(1); if V_ε = 0 and the channel is not exotic or ε < 1/2, the bound is nC + O(1), and otherwise nC + (1/2) log n + O(1); for exotic channels at ε = 1/2, the bound is nC + O(n^{1/3}) [PPV10]]

71 Proof Technique for Tight Third-Order Term
For the regular case,
ρ_n ≤ (1/2) log n + O(1)
The type-counting trick and upper bounds on M*_P(W^n, ε) are not sufficiently tight
We need a new converse bound for general DMCs
Information spectrum divergence:
D_s^ε(P‖Q) := sup{ R ∈ ℝ : P[ log (P(X)/Q(X)) ≤ R ] ≤ ε }
See Information Spectrum Methods in Information Theory by T. S. Han (2003)

75 Proof Technique: Information Spectrum Divergence
D_s^ε(P‖Q) := sup{ R ∈ ℝ : P[ log (P(X)/Q(X)) ≤ R ] ≤ ε }
[Plot: the density of log (P(X)/Q(X)); R is placed so that mass ε lies to its left and mass 1 − ε to its right]
If X^n is i.i.d. P, the central limit theorem yields
D_s^ε(P^n‖Q^n) ≈ nD(P‖Q) − √(nV(P‖Q)) Q^{−1}(ε)
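This CLT approximation is easy to verify by simulation; the sketch below (my own, with assumed P, Q, and sample sizes) compares the empirical ε-quantile of the log-likelihood-ratio sum against the Gaussian approximation:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    P, Q, eps, n = np.array([0.6, 0.4]), np.array([0.5, 0.5]), 0.1, 200
    llr = np.log2(P / Q)                            # per-letter log-likelihood ratio
    D = np.sum(P * llr)                             # relative entropy D(P||Q)
    V = np.sum(P * (llr - D) ** 2)                  # relative entropy variance
    samples = rng.choice(len(P), size=(20_000, n), p=P)
    sums = llr[samples].sum(axis=1)                 # log P^n(X^n)/Q^n(X^n) under P^n
    print(np.quantile(sums, eps))                   # empirical D_s^eps(P^n || Q^n)
    print(n * D - np.sqrt(n * V) * norm.isf(eps))   # CLT approximation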

79 Proof Technique: The New Converse Bound
Lemma (Tomamichel-Tan (2013))
For every channel W, every ε ∈ (0, 1) and δ ∈ (0, 1 − ε), we have
log M*(W, ε) ≤ min_{Q ∈ P(Y)} max_{x ∈ X} D_s^{ε+δ}(W(·|x)‖Q) + log (1/δ)
When the DMC is used n times,
log M*(W^n, ε) ≤ min_{Q^(n) ∈ P(Y^n)} max_{x ∈ X^n} D_s^{ε+δ}(W^n(·|x)‖Q^(n)) + log (1/δ)
Choose δ = n^{−1/2} so that log (1/δ) = (1/2) log n
Since all x within a type class result in the same D_s^{ε+δ} (if Q^(n) is permutation invariant), it's really a max over types P_x ∈ P_n(X)
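For a small channel the one-shot quantity can be computed exactly; below is a sketch (my own; it plugs in the uniform Q rather than the minimizing one, which still yields a valid, if weaker, upper bound) of D_s^{ε+δ} and the resulting one-shot bound in bits:

    import numpy as np

    def D_s(P, Q, eps):
        """sup{ R : P[log2(P(X)/Q(X)) <= R] <= eps } for finite alphabets."""
        llr = np.log2(P / Q)
        order = np.argsort(llr)                 # accumulate P-mass from small llr up
        cdf = np.cumsum(P[order])
        idx = np.searchsorted(cdf, eps, side='right')   # first jump past eps
        return llr[order[min(idx, len(llr) - 1)]]

    p, eps, delta = 0.11, 0.1, 0.05
    W = np.array([[1 - p, p], [p, 1 - p]])
    Q = np.array([0.5, 0.5])                    # an (assumed) suboptimal choice of Q
    bound = max(D_s(W[x], Q, eps + delta) for x in range(2)) + np.log2(1 / delta)
    print(bound)    # an upper bound on log2 M*(W, eps)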

83 Proof Technique: Choice of Output Distribution
log M*(W^n, ε) ≤ max_{x ∈ X^n} D_s^{ε+δ}(W^n(·|x)‖Q^(n)) + log (1/δ), for any Q^(n) ∈ P(Y^n)
Q^(n)(y): invariant to permutations of the n channel uses
Q^(n)(y) := (1/2) Σ_{k ∈ K} λ(k) Q_k^n(y) + (1/2) Σ_{P ∈ P_n(X)} (1/|P_n(X)|) (PW)^n(y)
First term: the Q_k's and λ(k)'s are designed to form an n^{−1/2}-cover of P(Y):
∀ Q ∈ P(Y), ∃ k ∈ K s.t. ‖Q − Q_k‖ ≤ n^{−1/2}
Second term: a mixture over the output distributions induced by the input types [Hayashi (2009)]
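To make the construction concrete, here is a toy sketch (my own, with several simplifying assumptions: binary alphabets, uniform weights λ(k), and a one-dimensional n^{−1/2}-grid as the cover) of the two-part mixture:

    import numpy as np

    n, p = 16, 0.11
    W = np.array([[1 - p, p], [p, 1 - p]])
    grid = np.clip(np.arange(0.0, 1.0 + 1e-9, 1 / np.sqrt(n)), 1e-3, 1 - 1e-3)
    types = [np.array([k / n, 1 - k / n]) for k in range(n + 1)]   # P_n(X)

    def Q_n(y):
        """Mixture (1/2) mean_k Q_k^n(y) + (1/2) mean_P (PW)^n(y), for binary Y."""
        ones = sum(y)
        zeros = len(y) - ones
        term1 = np.mean([q ** zeros * (1 - q) ** ones for q in grid])
        term2 = np.mean([(P @ W)[0] ** zeros * (P @ W)[1] ** ones for P in types])
        return 0.5 * term1 + 0.5 * term2

    print(Q_n([0, 1, 0, 0] + [0] * 12))    # depends on y only through its type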

88 Proof Technique: Choice of Output Distribution
Q^(n)(y) := (1/2) Σ_{k ∈ K} λ(k) Q_k^n(y) + (1/2) Σ_{P ∈ P_n(X)} (1/|P_n(X)|) (PW)^n(y)
[Figure, built up over several slides: the simplex P(Y) with vertices Q(0) = (0, 1) and Q(1) = (1, 0); the cover points Q_k = Q_{[i,j]} form a grid of spacing 1/√n, so any Q ∈ P(Y) is within 1/√n of some grid point]

91 Proof Technique: Summary
Q^(n)(y) := (1/2) Σ_{k ∈ K} λ(k) Q_k^n(y) + (1/2) Σ_{P ∈ P_n(X)} (1/|P_n(X)|) (PW)^n(y)
This construction ensures that every type P_x near the CAID is well-approximated by a Q_{k(x)}
Well-approximated in the sense that the loss is log (1/λ(k)) = O(1) for every x such that P_x is near the CAID
For types P_x far from the CAID, use the second part and I(P_x, W) ≤ C′ < C

96 Summary and Food for Thought
We showed that for DMCs with positive ε-dispersion,
log M*(W^n, ε) ≤ nC − √(nV_ε) Q^{−1}(ε) + (1/2) log n + O(1)
How important is the assumption of discreteness?
Does our uniform quantization technique extend to lossy source coding? [Ingber-Kochman (2010), Kostina-Verdú (2012)]
Alternate proof using Bahadur-Ranga Rao [Moulin (2012)]?
P( (1/n) Σ_{i=1}^n X_i ≥ c ) = Θ( exp(−n I(c)) / √n )
This result has been used to refine the sphere-packing bound [Altug-Wagner (2012)]
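The Bahadur-Ranga Rao sharpening can be checked numerically in a simple case; the sketch below (my own, for Bernoulli(1/2) sums with rate function I(c) = c log 2c + (1 − c) log 2(1 − c)) shows the product tail · √n · exp(nI(c)) staying roughly constant:

    import numpy as np
    from scipy.stats import binom

    c = 0.7
    I = c * np.log(2 * c) + (1 - c) * np.log(2 * (1 - c))   # large-deviations rate at c
    for n in [100, 400, 1600]:
        tail = binom.sf(int(np.ceil(c * n)) - 1, n, 0.5)    # P(sum >= c n)
        print(n, tail * np.sqrt(n) * np.exp(n * I))         # roughly constant in n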


(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

Hypothesis Testing with Communication Constraints

Hypothesis Testing with Communication Constraints Hypothesis Testing with Communication Constraints Dinesh Krithivasan EECS 750 April 17, 2006 Dinesh Krithivasan (EECS 750) Hyp. testing with comm. constraints April 17, 2006 1 / 21 Presentation Outline

More information

An Improved Sphere-Packing Bound for Finite-Length Codes over Symmetric Memoryless Channels

An Improved Sphere-Packing Bound for Finite-Length Codes over Symmetric Memoryless Channels An Improved Sphere-Packing Bound for Finite-Length Codes over Symmetric Memoryless Channels Gil Wiechman Igal Sason Department of Electrical Engineering Technion, Haifa 3000, Israel {igillw@tx,sason@ee}.technion.ac.il

More information

On Composite Quantum Hypothesis Testing

On Composite Quantum Hypothesis Testing University of York 7 November 207 On Composite Quantum Hypothesis Testing Mario Berta Department of Computing with Fernando Brandão and Christoph Hirche arxiv:709.07268 Overview Introduction 2 Composite

More information

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( ) Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca October 22nd, 2014 E. Tanré (INRIA - Team Tosca) Mathematical

More information

EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm

EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm 1. Feedback does not increase the capacity. Consider a channel with feedback. We assume that all the recieved outputs are sent back immediately

More information

Lecture 15: Conditional and Joint Typicaility

Lecture 15: Conditional and Joint Typicaility EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of

More information

Semidefinite programming strong converse bounds for quantum channel capacities

Semidefinite programming strong converse bounds for quantum channel capacities Semidefinite programming strong converse bounds for quantum channel capacities Xin Wang UTS: Centre for Quantum Software and Information Joint work with Wei Xie, Runyao Duan (UTS:QSI) QIP 2017, Microsoft

More information

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is

More information