A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback

Vincent Y. F. Tan (NUS), joint work with Silas L. Fong (Toronto). 2017 Information Theory Workshop, Kaohsiung, Taiwan.

Channel Model: The Parallel Gaussian Channel

Channel law (with unit gains $g_1 = g_2 = \dots = g_d = 1$):
$$Y_k = X_k + Z_k, \qquad \begin{bmatrix} Y_{1,k} \\ Y_{2,k} \\ \vdots \\ Y_{d,k} \end{bmatrix} = \begin{bmatrix} X_{1,k} \\ X_{2,k} \\ \vdots \\ X_{d,k} \end{bmatrix} + \begin{bmatrix} Z_{1,k} \\ Z_{2,k} \\ \vdots \\ Z_{d,k} \end{bmatrix}, \qquad k \in [1:n],$$
where the $Z_{l,k}$ are i.i.d. $\mathcal{N}(0, N_l)$ for $l \in [1:d]$.
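To make the channel law concrete, here is a minimal simulation sketch; the dimension $d$, the noise variances $N_l$, and the blocklength $n$ below are arbitrary example values, and the inputs are placeholders rather than actual codewords.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 3, 1000                     # number of sub-channels and blocklength (arbitrary)
N = np.array([0.5, 1.0, 2.0])      # noise variances N_1, ..., N_d (arbitrary)

X = rng.normal(size=(d, n))                              # placeholder inputs X_{l,k}
Z = rng.normal(scale=np.sqrt(N)[:, None], size=(d, n))   # Z_{l,k} ~ N(0, N_l), i.i.d. over k
Y = X + Z                                                # channel law Y_k = X_k + Z_k
```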

Capacity of the Parallel Gaussian Channel

Consider a peak power constraint
$$\Pr\left\{ \frac{1}{n} \sum_{l=1}^{d} \sum_{k=1}^{n} X_{l,k}^2 \le P \right\} = 1.$$

Define the capacity functions
$$\mathsf{C}(\mathbf{s}) = \sum_{l=1}^{d} \mathsf{C}\!\left(\frac{s_l}{N_l}\right), \quad \text{where } \mathsf{C}(P) := \frac{1}{2}\log(1+P).$$

The vanishing-error capacity is $\mathsf{C}(\mathbf{P}^*)$, where $\mathbf{P}^* = (P_1^*, P_2^*, \dots, P_d^*)$ is given by the water-filling solution
$$P_l^* = (\lambda - N_l)^+, \quad \text{where } \lambda > 0 \text{ satisfies } \sum_{l=1}^{d} P_l^* = P.$$
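A small sketch of the water-filling computation, using bisection on the water level $\lambda$; the noise variances and power budget are arbitrary example values.

```python
import numpy as np

def water_filling(N, P, tol=1e-12):
    """Water-filling: P_l* = (lam - N_l)^+ with sum_l P_l* = P, via bisection on lam."""
    lo, hi = np.min(N), np.max(N) + P            # allocated power is 0 at lo and >= P at hi
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        if np.maximum(lam - N, 0.0).sum() > P:
            hi = lam
        else:
            lo = lam
    return np.maximum(0.5 * (lo + hi) - N, 0.0)

N = np.array([0.5, 1.0, 2.0])                    # arbitrary noise variances
P = 2.0                                          # arbitrary total power budget
P_star = water_filling(N, P)
C = 0.5 * np.log(1.0 + P_star / N).sum()         # C(P*) in nats
print(P_star, P_star.sum(), C)
```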

Capacity vs. Non-Asymptotic Fundamental Limits

Define the non-asymptotic fundamental limit
$$M^*(n, \varepsilon, P) := \max\{ M \in \mathbb{N} : \exists\, (n, M, \varepsilon, P)\text{-code} \},$$
the maximum number of messages that can be supported by a length-$n$ code with peak power $P$ and average error probability $\varepsilon$.

The capacity result can be stated as
$$\lim_{\varepsilon \to 0} \liminf_{n \to \infty} \frac{1}{n} \log M^*(n, \varepsilon, P) = \mathsf{C}(\mathbf{P}^*).$$

In fact, the strong converse holds, and we have
$$\liminf_{n \to \infty} \frac{1}{n} \log M^*(n, \varepsilon, P) = \mathsf{C}(\mathbf{P}^*) \qquad \forall\, \varepsilon \in (0,1).$$

Strong Converse

With $\varepsilon = \limsup_{n\to\infty} P_e^{(n)}$ and $R = \liminf_{n\to\infty} \frac{1}{n} \log M^*(n, \varepsilon, P)$, the error probability under the peak power constraint exhibits a sharp phase transition. [Figure: $\varepsilon$ versus $R$; $\varepsilon$ jumps from $0$ to $1$ as the rate $R$ crosses $\mathsf{C}(\mathbf{P}^*)$.]

Second-Order Asymptotics

Let
$$\mathsf{V}(P) := \frac{P(P+2)}{2(P+1)^2}$$
be the dispersion of the point-to-point AWGN channel. Define the sum dispersion function to be
$$\mathsf{V}(\mathbf{s}) := \sum_{l=1}^{d} \mathsf{V}\!\left(\frac{s_l}{N_l}\right).$$

Then Polyanskiy (2010) and Tomamichel-Tan (2015) showed that
$$\log M^*(n, \varepsilon, P) = n\mathsf{C}(\mathbf{P}^*) + \sqrt{n \mathsf{V}(\mathbf{P}^*)}\, \Phi^{-1}(\varepsilon) + O(\log n),$$
where $\mathbf{P}^* = (P_1^*, P_2^*, \dots, P_d^*)$ is the optimal (water-filling) power allocation and
$$\Phi(b) = \int_{-\infty}^{b} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{u^2}{2}\right) \mathrm{d}u.$$
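A sketch evaluating this normal approximation (with the $O(\log n)$ term dropped). To keep it self-contained, the noise variances are taken equal, so water-filling simply splits the budget evenly; all parameter values are arbitrary examples.

```python
import numpy as np
from scipy.stats import norm

def C_fun(snr):
    return 0.5 * np.log(1.0 + snr)               # C(P) = (1/2) log(1 + P), in nats

def V_fun(snr):
    return snr * (snr + 2.0) / (2.0 * (snr + 1.0) ** 2)

# Equal noise variances, so water-filling splits the budget evenly (arbitrary example).
d, P = 2, 2.0
N = np.ones(d)
P_star = np.full(d, P / d)

n, eps = 1000, 1e-3
C = C_fun(P_star / N).sum()                      # C(P*) = sum_l C(P_l*/N_l)
V = V_fun(P_star / N).sum()                      # V(P*) = sum_l V(P_l*/N_l)
logM = n * C + np.sqrt(n * V) * norm.ppf(eps)    # Phi^{-1}(eps) = norm.ppf(eps) < 0 here
print(logM / n, C)                               # second-order backoff from capacity
```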

Feedback

If feedback is present, then the encoder at time $k$ has access to the message $W$ and the previous channel outputs $(Y_1, Y_2, \dots, Y_{k-1})$, i.e.,
$$X_k = X_k(W, Y^{k-1}), \qquad k \in [1:n].$$

Define the non-asymptotic fundamental limit
$$M^*_{\mathrm{FB}}(n, \varepsilon, P) := \max\{ M \in \mathbb{N} : \exists\, (n, M, \varepsilon, P)\text{-feedback code} \}.$$

Since feedback does not increase the capacity of point-to-point memoryless channels [Shannon (1956)],
$$\lim_{\varepsilon \to 0} \liminf_{n \to \infty} \frac{1}{n} \log M^*_{\mathrm{FB}}(n, \varepsilon, P) = \mathsf{C}(\mathbf{P}^*).$$

Natural question: What happens to the second-order terms?

Main Result

Theorem (Fong-Tan (2017)). Feedback does not affect the second-order asymptotics of parallel Gaussian channels, i.e.,
$$\log M^*_{\mathrm{FB}}(n, \varepsilon, P) = n\mathsf{C}(\mathbf{P}^*) + \sqrt{n \mathsf{V}(\mathbf{P}^*)}\, \Phi^{-1}(\varepsilon) + o(\sqrt{n}).$$

Recall that without feedback
$$\log M^*(n, \varepsilon, P) = n\mathsf{C}(\mathbf{P}^*) + \sqrt{n \mathsf{V}(\mathbf{P}^*)}\, \Phi^{-1}(\varepsilon) + O(\log n).$$

- No guarantees on the third-order term.
- For achievability, the encoder can simply ignore the feedback.
- Hence we only need to prove the converse part.

Comparison to the AWGN channel with feedback

For the AWGN channel with feedback ($d = 1$), the information density
$$\log \frac{p^n_{Y|X}(Y^n \mid x^n)}{p^n_Y(Y^n)} = \frac{\|Y^n\|^2 - \|Y^n - x^n\|^2}{2N} + c = \frac{\langle x^n + Z^n,\, x^n\rangle}{N} + c = \frac{1}{N} \sum_{k=1}^{n} Z_k x_k + c$$
(where $c$ denotes possibly different terms whose distribution does not depend on $x^n$) has the same distribution for all $x^n$ such that $\|x^n\|^2 = nP$.

This spherical symmetry argument [Polyanskiy (2011, 2013) and Fong-Tan (2015)] yields
$$\log M^*_{\mathrm{FB}}(n, \varepsilon, P) = n\mathsf{C}(P) + \sqrt{n \mathsf{V}(P)}\, \Phi^{-1}(\varepsilon) + O(\log n).$$

The symmetry argument doesn't work for parallel Gaussian channels.
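A quick Monte Carlo sanity check of the spherical-symmetry claim, as a sketch under arbitrary parameter choices: the statistic $\frac{1}{N}\sum_k Z_k x_k$ is $\mathcal{N}(0, nP/N)$ for every $x^n$ on the power sphere, so two very different codewords of the same norm produce matching statistics.

```python
import numpy as np

rng = np.random.default_rng(1)
n, P, N = 200, 1.0, 1.0                          # arbitrary example values

# Two codewords with the same energy ||x||^2 = nP but different "directions".
x1 = np.full(n, np.sqrt(P))                      # constant codeword
x2 = rng.normal(size=n)
x2 *= np.sqrt(n * P) / np.linalg.norm(x2)        # random direction, same norm

Z = rng.normal(scale=np.sqrt(N), size=(100_000, n))
s1 = Z @ x1 / N                                  # (1/N) sum_k Z_k x_k for x1
s2 = Z @ x2 / N                                  # ... and for x2
print(s1.std(), s2.std(), np.sqrt(n * P / N))    # all approximately equal
```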

Elements of the Converse Proof

Want to show
$$\limsup_{n\to\infty} \frac{\log M^*_{\mathrm{FB}}(n, \varepsilon, P) - n\mathsf{C}(\mathbf{P}^*)}{\sqrt{n}} \le \sqrt{\mathsf{V}(\mathbf{P}^*)}\, \Phi^{-1}(\varepsilon).$$

Proof steps:
- Discretize the power allocation: turn power allocations into power types.
- Apply the meta-converse [Polyanskiy-Poor-Verdú (2010)].
- Apply Curtiss' theorem to the information spectrum term for close-to-optimal power types.
- Apply large deviations bounds to far-from-optimal power types.

Turn Power Allocations into Types: Part I

Given a channel input $x^n \in \mathbb{R}^{d \times n}$, define its power type to be
$$\phi(x^n) = \frac{1}{n}\left[ \|x_1^n\|^2,\; \|x_2^n\|^2,\; \dots,\; \|x_d^n\|^2 \right] = \frac{1}{n}\left[ \sum_{k=1}^n x_{1,k}^2,\; \sum_{k=1}^n x_{2,k}^2,\; \dots,\; \sum_{k=1}^n x_{d,k}^2 \right] \in \mathbb{R}_+^d.$$

Create a new code such that, almost surely,
$$\sum_{k=1}^n X_{l,k}^2 = m_l P \quad \text{for some } m_l \in [1:n] \text{ and all } l \in [1:d]$$
and
$$\sum_{l=1}^d \sum_{k=1}^n X_{l,k}^2 = nP, \qquad \text{i.e.,} \qquad \sum_{l=1}^d m_l = n.$$
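A sketch of the power-type map $\phi$ together with a simple quantization step. The helper names are hypothetical and the rounding scheme is only an illustration of snapping a type onto the grid, not the paper's exact code modification.

```python
import numpy as np

def power_type(x, n):
    """phi(x^n) = (1/n)[||x_1^n||^2, ..., ||x_d^n||^2] for x of shape (d, n)."""
    return (x ** 2).sum(axis=1) / n

def quantize_type(phi, P, n):
    """Snap a power type onto the grid {(P/n) a : a in Z_+^d, sum_l a_l = n}."""
    a = np.floor(phi * n / P).astype(int)        # round each coordinate down
    a[np.argmax(a)] += n - a.sum()               # restore sum_l a_l = n
    return a * P / n

rng = np.random.default_rng(2)
d, n, P = 3, 100, 2.0
x = rng.normal(size=(d, n))
x *= np.sqrt(n * P / (x ** 2).sum())             # enforce total energy nP
phi = power_type(x, n)
print(phi, quantize_type(phi, P, n))
```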

Turn Power Allocations into Types: Part II

Power allocation vector set
$$\mathcal{S}^{(n)} := \left\{ \frac{P}{n}\, \mathbf{a} \;:\; \mathbf{a} = (a_1, a_2, \dots, a_d) \in \mathbb{Z}_+^d,\; \sum_{l=1}^d a_l = n \right\}.$$

[Figure for $d = 2$: the grid points $\frac{P}{n}\mathbf{a}$ with spacing $P/n$ along the segment from $\mathbf{a} = (n, 0)$ to $\mathbf{a} = (0, n)$.]

Turn Power Allocations into Types: Part III

$\mathcal{S}^{(n)}$ is the set of quantized power allocation vectors $\mathbf{s}$, with quantization level $P/n$, that satisfy the equality power constraint
$$\sum_{l=1}^d s_l = P.$$

With the above construction, almost surely,
$$\phi(X^n) \in \mathcal{S}^{(n)}.$$

So we can pretend that the codewords are discrete and leverage existing theory from second-order analysis for DMCs.
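The set $\mathcal{S}^{(n)}$ is finite with $|\mathcal{S}^{(n)}| = \binom{n+d-1}{d-1}$, which is polynomial in $n$ for fixed $d$ (a fact used again for the far-from-optimal types later). A sketch enumerating it by stars and bars for small parameters:

```python
from itertools import combinations
from math import comb

def power_types(n, d, P):
    """Enumerate S^(n): all (P/n)*a with a in Z_+^d summing to n (stars and bars)."""
    types = []
    for bars in combinations(range(n + d - 1), d - 1):
        cuts = (-1,) + bars + (n + d - 1,)
        a = [cuts[i + 1] - cuts[i] - 1 for i in range(d)]   # gap sizes between bars
        types.append(tuple(P * ai / n for ai in a))
    return types

S = power_types(n=10, d=3, P=2.0)
print(len(S), comb(10 + 3 - 1, 3 - 1))           # both 66: polynomial in n for fixed d
```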

Application of the Meta-Converse (PPV'10)

By a relaxation of the meta-converse, for any $\xi > 0$,
$$\log M^*_{\mathrm{FB}}(n, \varepsilon, P) \le \log \xi - \log\left( 1 - \varepsilon - \Pr\left\{ \log \frac{p^n_{Y|X}(Y^n \mid X^n)}{q_{Y^n}(Y^n)} > \log \xi \right\} \right).$$

Choose the auxiliary output distribution $q_{Y^n}$ carefully. Inspired by Hayashi (2009), we choose the equal-weight mixture
$$q_{Y^n}(y^n) = \frac{1}{2}\, \underbrace{\frac{1}{|\mathcal{S}^{(n)}|} \sum_{\mathbf{s} \in \mathcal{S}^{(n)}} \prod_{l,k} \mathcal{N}(y_{l,k};\, 0,\, s_l + N_l)}_{q^{(1)}_{Y^n}(y^n)} \;+\; \frac{1}{2}\, \underbrace{\prod_{l,k} \mathcal{N}(y_{l,k};\, 0,\, P_l^* + N_l)}_{q^{(2)}_{Y^n}(y^n)}.$$

- $q^{(1)}_{Y^n}$: uniform mixture of the output distributions induced by the input power types.
- $q^{(2)}_{Y^n}$: capacity-achieving output distribution.
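A sketch evaluating the two components of this auxiliary output density for small $(n, d)$, under the equal-weight reading of the mixture above; for $d = 2$ the type set can be listed directly, and the noise variances and budget are arbitrary example values (for $N = (0.5, 1.0)$ and $P = 2$, water-filling gives $\mathbf{P}^* = (1.25, 0.75)$).

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

d, n, P = 2, 8, 2.0
N = np.array([0.5, 1.0])
P_star = np.array([1.25, 0.75])                  # exact water-filling for this N, P

# For d = 2, S^(n) is simply {(P*i/n, P*(n-i)/n) : i = 0, ..., n}.
S = np.array([[P * i / n, P * (n - i) / n] for i in range(n + 1)])

y = np.random.default_rng(3).normal(size=(d, n)) # an arbitrary output sequence

def log_prod_density(y, var):
    """log prod_{l,k} N(y_{l,k}; 0, var_l)."""
    return norm.logpdf(y, scale=np.sqrt(var)[:, None]).sum()

log_q1 = logsumexp([log_prod_density(y, s + N) for s in S]) - np.log(len(S))
log_q2 = log_prod_density(y, P_star + N)
log_q = logsumexp([np.log(0.5) + log_q1, np.log(0.5) + log_q2])
print(log_q1, log_q2, log_q)
```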

Almost-Optimal Types: Part I

Define the set of almost-optimal power types
$$\Pi^{(n)} := \left\{ \mathbf{s} \in \mathcal{S}^{(n)} \;:\; \|\mathbf{s} - \mathbf{P}^*\|_2 \le \frac{1}{n^{1/6}} \right\}.$$

[Figure for $d = 3$: the discrete simplex $\mathcal{S}^{(n)}$ of power types, with the water-filling allocation $\mathbf{P}^*$ marked and $\Pi^{(n)}$ the types within distance $n^{-1/6}$ of $\mathbf{P}^*$.]
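A sketch of how $\mathcal{S}^{(n)}$ splits into $\Pi^{(n)}$ and its complement as $n$ grows, again with $d = 2$ and arbitrary example values so that water-filling can be computed inline.

```python
import numpy as np

N = np.array([0.5, 1.0])
P = 2.0
lam = (P + N.sum()) / 2                          # water level when both channels are active
P_star = lam - N                                 # P* = (1.25, 0.75)

for n in [100, 1000, 10000]:
    S = np.array([[P * i / n, P * (n - i) / n] for i in range(n + 1)])
    near = np.linalg.norm(S - P_star, axis=1) <= n ** (-1 / 6)
    print(n, near.sum(), (~near).sum())          # |Pi^(n)| grows roughly like n^(5/6)
```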

Almost-Optimal Types: Part II

Define the threshold
$$\log \xi := n\mathsf{C}(\mathbf{P}^*) + \sqrt{n \mathsf{V}(\mathbf{P}^*)}\, \Phi^{-1}(\varepsilon + \tau) + \log\left( 2\, |\mathcal{S}^{(n)}| \right).$$

The contribution of the information spectrum term on $\Pi^{(n)}$ is
$$\Pr\left\{ \frac{1}{\sqrt{n \mathsf{V}(\mathbf{P}^*)}} \sum_{k=1}^n U_k^{(\mathbf{P}^*)} \ge \Phi^{-1}(\varepsilon + \tau) \right\},$$
where
$$U_k^{(\mathbf{P}^*)} := \sum_{l=1}^d \frac{-\left(\frac{P_l^*}{N_l}\right) Z_{l,k}^2 + 2 X_{l,k} Z_{l,k} + P_l^*}{2(P_l^* + N_l)}, \qquad k \in [1:n],$$
and $\phi(X^n) \in \Pi^{(n)}$.

Because of feedback, $U_k^{(\mathbf{P}^*)}$ is not independent across time! (Each $X_{l,k}$ depends on the past noise samples through the feedback.)

Almost-Optimal Types: Part III

We would like to approximate the nasty $\{ U_k^{(\mathbf{P}^*)} : k \in [1:n] \}$ by the independent and identically distributed random variables
$$V_k^{(\mathbf{P}^*)} := \sum_{l=1}^d \frac{-\left(\frac{P_l^*}{N_l}\right) Z_{l,k}^2 + 2 \sqrt{P_l^*}\, Z_{l,k} + P_l^*}{2(P_l^* + N_l)}, \qquad k \in [1:n].$$

Due to our discretization procedure and the fact that $\phi(X^n) \in \Pi^{(n)}$,
$$\lim_{n\to\infty} \mathbb{E}\left[ \exp\left( \frac{t}{\sqrt{n}} \sum_{k=1}^n U_k^{(\mathbf{P}^*)} \right) \right] = \lim_{n\to\infty} \mathbb{E}\left[ \exp\left( \frac{t}{\sqrt{n}} \sum_{k=1}^n V_k^{(\mathbf{P}^*)} \right) \right]$$
for all $t \in \mathbb{R}$.
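Under the sign conventions displayed above, each $V_k^{(\mathbf{P}^*)}$ has mean zero and variance $\mathsf{V}(\mathbf{P}^*)$, which is exactly what makes the CLT step on the next slide come out right. A Monte Carlo sketch checking these moments with arbitrary example parameters:

```python
import numpy as np

rng = np.random.default_rng(4)
N = np.array([0.5, 1.0])
P_star = np.array([1.25, 0.75])                  # water-filling for P = 2 (see above)

def V_fun(snr):
    return snr * (snr + 2.0) / (2.0 * (snr + 1.0) ** 2)

m = 1_000_000
Z = rng.normal(scale=np.sqrt(N)[:, None], size=(2, m))
V = ((-(P_star / N)[:, None] * Z ** 2
      + 2.0 * np.sqrt(P_star)[:, None] * Z
      + P_star[:, None]) / (2.0 * (P_star + N))[:, None]).sum(axis=0)
print(V.mean(), V.var(), V_fun(P_star / N).sum())   # mean ~ 0, variance ~ V(P*)
```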

Almost-Optimal Types: Part IV

Curtiss' theorem (a moment-generating-function analogue of Lévy's continuity theorem): if
$$\lim_{n\to\infty} \mathbb{E}[\exp(t X_n)] = \lim_{n\to\infty} \mathbb{E}[\exp(t Y_n)] \qquad \forall\, t \in \mathbb{R},$$
then
$$\lim_{n\to\infty} \Pr\{X_n \le a\} = \lim_{n\to\infty} \Pr\{Y_n \le a\} \qquad \forall\, a \in \mathbb{R}.$$

Thus,
$$\lim_{n\to\infty} \Pr\left\{ \frac{1}{\sqrt{n \mathsf{V}(\mathbf{P}^*)}} \sum_{k=1}^n U_k^{(\mathbf{P}^*)} \ge \Phi^{-1}(\varepsilon+\tau) \right\} = \lim_{n\to\infty} \Pr\left\{ \frac{1}{\sqrt{n \mathsf{V}(\mathbf{P}^*)}} \sum_{k=1}^n V_k^{(\mathbf{P}^*)} \ge \Phi^{-1}(\varepsilon+\tau) \right\} \overset{\text{CLT}}{=} 1 - (\varepsilon + \tau).$$
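A sketch verifying the CLT endpoint numerically: the normalized sum of the i.i.d. $V_k^{(\mathbf{P}^*)}$ exceeds $\Phi^{-1}(\varepsilon+\tau)$ with probability close to $1-(\varepsilon+\tau)$. Same arbitrary example parameters as before; $\varepsilon + \tau$ is a single illustrative value.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
N = np.array([0.5, 1.0])
P_star = np.array([1.25, 0.75])
snr = P_star / N
V_sum = (snr * (snr + 2) / (2 * (snr + 1) ** 2)).sum()   # V(P*)

n, trials = 500, 10_000
eps_tau = 0.1                                    # eps + tau (arbitrary)
thresh = norm.ppf(eps_tau)                       # Phi^{-1}(eps + tau)

Z = rng.normal(scale=np.sqrt(N)[:, None, None], size=(2, trials, n))
V = ((-snr[:, None, None] * Z ** 2
      + 2 * np.sqrt(P_star)[:, None, None] * Z
      + P_star[:, None, None]) / (2 * (P_star + N))[:, None, None]).sum(axis=0)
stat = V.sum(axis=1) / np.sqrt(n * V_sum)
print((stat >= thresh).mean(), 1 - eps_tau)      # both approximately 0.9
```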

Far-From-Optimal Types

The contribution of the information spectrum term on $\mathcal{S}^{(n)} \setminus \Pi^{(n)}$ is
$$\Pr\left\{ \frac{1}{n} \sum_{k=1}^n U_k^{(\mathbf{s})} \ge \mathsf{C}(\mathbf{P}^*) - \mathsf{C}(\mathbf{s}) \right\}, \qquad \mathbf{s} \in \mathcal{S}^{(n)} \setminus \Pi^{(n)}.$$

Recall
$$\mathcal{S}^{(n)} \setminus \Pi^{(n)} = \left\{ \mathbf{s} \in \mathcal{S}^{(n)} \;:\; \|\mathbf{s} - \mathbf{P}^*\|_2 > \frac{1}{n^{1/6}} \right\}.$$

On this set, since $\mathbf{P}^*$ maximizes $\mathsf{C}(\cdot)$,
$$\mathsf{C}(\mathbf{P}^*) - \mathsf{C}(\mathbf{s}) = \Omega\left( \frac{1}{n^{1/3}} \right).$$

Thus, by a standard exponential Markov inequality argument,
$$\max_{\mathbf{s} \in \mathcal{S}^{(n)} \setminus \Pi^{(n)}} \Pr\left\{ \frac{1}{n} \sum_{k=1}^n U_k^{(\mathbf{s})} \ge \mathsf{C}(\mathbf{P}^*) - \mathsf{C}(\mathbf{s}) \right\} \le \exp\left( -\Omega(n^{1/3}) \right).$$

There are only polynomially many power types, so these contributions are negligible after a union bound.
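The exponential Markov (Chernoff) mechanism in toy form: for a sum of i.i.d. mean-zero variables, $\Pr\{\frac{1}{n}\sum_k W_k \ge \delta\} \le \exp(-n \sup_t (t\delta - \log \mathbb{E}\, e^{tW}))$, so a deviation of size $\delta = \Theta(n^{-1/3})$ yields a bound of order $\exp(-\Theta(n^{1/3}))$. A sketch with standard Gaussian $W_k$, where the exponent is exact; this illustrates the mechanism only, not the paper's actual bound.

```python
import numpy as np

# For W_k ~ N(0, 1), log E[exp(t W)] = t^2 / 2, and the Chernoff exponent is
#   sup_t (t*delta - t^2/2) = delta^2 / 2.
# With delta = n^(-1/3):
#   Pr{(1/n) sum_k W_k >= delta} <= exp(-n * delta^2 / 2) = exp(-n^(1/3) / 2).
for n in [10**3, 10**4, 10**5]:
    delta = n ** (-1 / 3)
    print(n, np.exp(-n * delta ** 2 / 2))        # decays like exp(-n^(1/3)/2)
```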

Wrap-Up and Future Work

For parallel Gaussian channels with feedback,
$$\log M^*_{\mathrm{FB}}(n, \varepsilon, P) = n\mathsf{C}(\mathbf{P}^*) + \sqrt{n \mathsf{V}(\mathbf{P}^*)}\, \Phi^{-1}(\varepsilon) + o(\sqrt{n}).$$

- Careful quantization of the power allocation vector into power types.
- Bounding the main information spectrum term using Curtiss' theorem.

For the third-order term, we need a speed of convergence: if
$$\sup_{t \in \mathbb{R}} \left| \mathbb{E}[\exp(t X_n)] - \mathbb{E}[\exp(t Y_n)] \right| \le f(n)$$
for some $f(n) = o(1)$, do we have
$$\sup_{a \in \mathbb{R}} \left| \Pr\{X_n \le a\} - \Pr\{Y_n \le a\} \right| \le g(n)$$
for some desirable $g(n)$? The Esséen smoothing lemma? Stein's method?

Full paper: October 2017 issue of the IEEE Transactions on Information Theory.

Advertisement: Postdoc Positions in IT and ML at NUS.
