Second-Order Asymptotics for the Gaussian MAC with Degraded Message Sets


Second-Order Asymptotics for the Gaussian MAC with Degraded Message Sets
Jonathan Scarlett (Department of Engineering, University of Cambridge) and Vincent Y. F. Tan (Department of Electrical and Computer Engineering, National University of Singapore)
Information Theory and Applications Workshop (Feb 2013)
Full version:

Second-Order Asymptotics for Point-to-Point Channels
Second-order asymptotics for the DMC and the AWGN channel are well understood.
M*(n, ε, S): maximum codebook size for n uses of the AWGN channel with SNR S and average error probability ε.
Hayashi (2009) and Polyanskiy-Poor-Verdú (2010) showed that
log M*(n, ε, S) = nC(S) + √(nV(S)) Φ^{-1}(ε) + o(√n) nats,
where the Gaussian capacity and Gaussian dispersion functions are defined as
C(S) := (1/2) log(1 + S),  V(S) := S(S + 2) / (2(S + 1)^2).
Second-order coding rate = largest coefficient of the √n term = √(V(S)) Φ^{-1}(ε).
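For a quick numerical feel for this expansion, here is a minimal Python sketch (assuming SciPy is available) that evaluates the normal approximation with the o(√n) term dropped; the blocklength, SNR, and ε below are arbitrary example values.

```python
from math import log, sqrt
from scipy.stats import norm

def C(S):
    """Gaussian capacity C(S) = (1/2) log(1 + S), in nats per channel use."""
    return 0.5 * log(1 + S)

def V(S):
    """Gaussian dispersion V(S) = S(S + 2) / (2 (S + 1)^2)."""
    return S * (S + 2) / (2 * (S + 1) ** 2)

def log_M_normal_approx(n, eps, S):
    """Normal approximation to log M*(n, eps, S):
    n C(S) + sqrt(n V(S)) Phi^{-1}(eps), in nats, dropping the o(sqrt(n)) term."""
    return n * C(S) + sqrt(n * V(S)) * norm.ppf(eps)

# Example (arbitrary values): SNR = 1 (0 dB), eps = 1e-3, n = 500 channel uses
print(log_M_normal_approx(500, 1e-3, 1.0) / 500, "nats/use vs capacity", C(1.0))
```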

Second-Order Asymptotics in Networks
Progress on second-order asymptotics in networks has been more modest.
Tan-Kosut (2014) and Nomura-Han (2012): the Slepian-Wolf problem.
Let L(ε; R1, R2) be the set of all (L1, L2) ∈ R^2 such that there exist length-n codes of sizes (M_{1,n}, M_{2,n}) and errors ε_n such that
lim sup_{n} (1/√n)(log M_{j,n} − nR_j) ≤ L_j, j = 1, 2,  and  lim sup_{n} ε_n ≤ ε.
[Figure: Slepian-Wolf rate region in the (R1, R2)-plane, with H(X1|X2) and H(X1) marked on the R1-axis, H(X2|X1) and H(X2) on the R2-axis, a boundary point (R1, R2), and a direction angle θ = tan^{-1}(L2/L1).]
(L1, L2) ∈ L(ε; R1, R2) implies there exist ε-reliable codes with log M_{j,n} ≤ nR_j + √n L_j + o(√n).

Second-Order Asymptotics in Networks (cont.)
[Figure: Slepian-Wolf rate region as above, with (R1, R2) at the corner point corresponding to (H1, H2|1).]
L(ε; H1, H2|1) = {(L1, L2) : Ψ(L2, L1 + L2; V_{2,12}) ≥ 1 − ε},
where
Ψ(z2, z3; V) := ∫_{−∞}^{z2} ∫_{−∞}^{z3} N(u; 0, V) du
and
V_{2,12} := Cov([−log p_{X2|X1}(X2|X1), −log p_{X1,X2}(X1,X2)]^T).
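For intuition about the entropy-dispersion matrix V_{2,12}, here is a small Python sketch (an illustration, not from the talk) that computes the mean entropy vector and V_{2,12} for a hypothetical doubly symmetric binary source; any joint pmf could be substituted.

```python
import numpy as np

# Hypothetical joint pmf p(x1, x2): doubly symmetric binary source, crossover 0.1
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])
p2_given_1 = p / p.sum(axis=1, keepdims=True)    # p(x2 | x1)

# Entropy-density vector [-log p(x2|x1), -log p(x1, x2)], tabulated over all (x1, x2)
h = np.stack([-np.log(p2_given_1).ravel(), -np.log(p).ravel()])
w = p.ravel()                                    # probabilities of the four pairs
mean = h @ w                                     # = [H(X2|X1), H(X1, X2)] in nats
V_2_12 = (h - mean[:, None]) @ np.diag(w) @ (h - mean[:, None]).T

print("entropies:", mean)
print("entropy-dispersion matrix V_{2,12}:\n", V_2_12)
```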

Second-Order Asymptotics in the MAC
The next-easiest network problem is the MAC.
Numerous attempts to characterize second-order asymptotics for the MAC, including Tan-Kosut (2014), Huang-Moulin (2012), MolavianJazi-Laneman (2012), Haim-Erez-Kochman (2012), Scarlett-Martinez-Guillén i Fàbregas (2013), etc.
No tight local-dispersion characterization yet for the DM-MAC or the Gaussian MAC.
Some problems (primarily for the converse):
1. Difficulty in characterizing the boundary of the capacity region (CR)
2. Independent input distributions
We consider a MAC-like model that retains the main characteristics of the MAC but skirts the problems above.

Gaussian MAC with Degraded Message Sets
[Figure: channel model. Messages M1, M2 are mapped by encoders f_{1,n}, f_{2,n} to codewords X1, X2; the output is Y = X1 + X2 + Z with Z ~ N(0, I_{n×n}); the decoder φ outputs (M̂1, M̂2). Power constraints: ‖x_j‖_2^2 ≤ nS_j, j = 1, 2.]
Encoder 1 has access to both messages.
The capacity region is well known [Exercise 5.18(b), El Gamal and Kim (2012)]; it is achieved using superposition coding.

20 Gaussian MAC with Degraded Message Sets R2 (nats/use) The Capacity Region CR Boundary ρ =0 ρ =1/3 ρ =2/3 R 1 C ( (1 ρ 2 )S 1 ) R 1 +R 2 C ( S 1 +S 2 +2ρ S 1 S 2 ) C(x) = 1 log(1 + x) R1 (nats/use) ρ [0, 1] parametrizes curved boundary and indicates the amount of correlation between users codewords. Vincent Tan (ECE-NUS) Gaussian MAC with DMS ITA / 17

Second-Order Asymptotics for Gaussian MAC w/ DMS
(R1, R2): an arbitrary point on the boundary of the capacity region.
L(ε; R1, R2): set of all (L1, L2) ∈ R^2 such that there exist length-n codes of sizes (M_{1,n}, M_{2,n}) and average errors ε_n such that
lim inf_{n} (1/√n)(log M_{j,n} − nR_j) ≥ L_j, j = 1, 2,  lim sup_{n} ε_n ≤ ε,
and ‖x_j‖_2^2 ≤ nS_j for j = 1, 2.
If (L1, L2) ∈ L(ε; R1, R2), there exist ε-reliable codes s.t.
log M_{j,n} ≥ nR_j + √n L_j + o(√n), j = 1, 2.
Main Contribution: A complete characterization of L(ε; R1, R2).
This is the first complete characterization of second-order asymptotics for a channel-type network information theory problem.

Some Basic Definitions
Mutual informations:
I(ρ) := [I1(ρ); I12(ρ)] = [C((1 − ρ^2) S1); C(S1 + S2 + 2ρ√(S1 S2))]
Derivative of the mutual informations:
D(ρ) := ∂I(ρ)/∂ρ = [−S1 ρ / (1 + S1(1 − ρ^2)); √(S1 S2) / (1 + S1 + S2 + 2ρ√(S1 S2))]
Dispersions:
V(x, y) := x(y + 2) / (2(x + 1)(y + 1)) and V(x) := V(x, x)
V(ρ) := [V1(ρ), V1,12(ρ); V1,12(ρ), V12,12(ρ)]
where
V1(ρ) := V((1 − ρ^2) S1), V12,12(ρ) := V(S1 + S2 + 2ρ√(S1 S2)), V1,12(ρ) := V((1 − ρ^2) S1, S1 + S2 + 2ρ√(S1 S2)).
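These are simple closed-form functions of ρ, S1, S2. The following sketch (function names are mine, not from the talk) evaluates them and checks the derivative D(ρ) against a finite difference of I(ρ).

```python
import numpy as np

def C(x): return 0.5 * np.log1p(x)
def Vdisp(x, y): return x * (y + 2) / (2 * (x + 1) * (y + 1))   # V(x, y); V(x) = V(x, x)

def I_vec(rho, S1, S2):
    return np.array([C((1 - rho**2) * S1),
                     C(S1 + S2 + 2 * rho * np.sqrt(S1 * S2))])

def D_vec(rho, S1, S2):
    return np.array([-S1 * rho / (1 + S1 * (1 - rho**2)),
                     np.sqrt(S1 * S2) / (1 + S1 + S2 + 2 * rho * np.sqrt(S1 * S2))])

def V_mat(rho, S1, S2):
    a = (1 - rho**2) * S1
    b = S1 + S2 + 2 * rho * np.sqrt(S1 * S2)
    return np.array([[Vdisp(a, a), Vdisp(a, b)],
                     [Vdisp(a, b), Vdisp(b, b)]])

# Finite-difference sanity check that D(rho) is the derivative of I(rho)
rho, h, S1, S2 = 0.5, 1e-6, 1.0, 2.0
print(D_vec(rho, S1, S2))
print((I_vec(rho + h, S1, S2) - I_vec(rho - h, S1, S2)) / (2 * h))
```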

Generalization of Inverse CDF of a Gaussian
For a positive semi-definite matrix V,
Ψ(z1, z2; V) := ∫_{−∞}^{z1} ∫_{−∞}^{z2} N(u; 0, V) du.
Given ε ∈ (0, 1),
Ψ^{-1}(V, ε) := {(z1, z2) : Ψ(−z1, −z2; V) ≥ 1 − ε}.
[Figure: boundaries of Ψ^{-1}(V(ρ), ε) in the (z1, z2)-plane for two values of ρ, each with ε = 0.01 and a larger ε.]
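A minimal sketch of Ψ and of the membership test for Ψ^{-1}(V, ε), assuming SciPy's bivariate normal CDF; the covariance matrix and test points below are arbitrary examples.

```python
import numpy as np
from scipy.stats import multivariate_normal

def Psi(z1, z2, V):
    """Psi(z1, z2; V) = P[Z1 <= z1, Z2 <= z2] for (Z1, Z2) ~ N(0, V)."""
    return multivariate_normal(mean=[0.0, 0.0], cov=V).cdf([z1, z2])

def in_Psi_inverse(z1, z2, V, eps):
    """Membership in Psi^{-1}(V, eps) = {(z1, z2) : Psi(-z1, -z2; V) >= 1 - eps}."""
    return Psi(-z1, -z2, V) >= 1 - eps

V = np.array([[1.0, 0.5],
              [0.5, 1.0]])
print(in_Psi_inverse(-3.0, -3.0, V, eps=0.01))   # well inside the region -> True
print(in_Psi_inverse(1.0, 1.0, V, eps=0.01))     # outside -> False
```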

The Main Result: Vertical Boundary
Points on the vertical boundary reduce to scalar dispersion, since the sum-rate constraint is in the error-exponents regime [Haim-Erez-Kochman (2012)]:
L(ε; R1, R2) = {(L1, L2) : L1 ≤ √(V1(0)) Φ^{-1}(ε)}.
[Figure: the capacity region with boundary curves for ρ = 0, 1/3, 2/3 and a point (R1, R2) on the vertical part of the boundary.]
The following expansion holds for the cardinality of the first codebook:
log M_{1,n} ≈ nI1(0) + √(nV1(0)) Φ^{-1}(ε).
Far from the sum-rate constraint:
lim_{n→∞} (1/n) log(M_{1,n} M_{2,n}) < I12(0).

The Main Result: Curved Boundary
Radically different behavior in the curved region:
L(ε; R1, R2) = {(L1, L2) : [L1; L1 + L2] ∈ ∪_{β ∈ R} {β D(ρ) + Ψ^{-1}(V(ρ), ε)}}.
[Figure: the capacity region with boundary curves for ρ = 0, 1/3, 2/3 and a point (R1, R2) on the curved part of the boundary.]
D(ρ) does not appear for the Slepian-Wolf problem.
Ψ^{-1}(V(ρ), ε) corresponds to only using N(0, Σ(ρ)), where
Σ(ρ) = [S1, ρ√(S1 S2); ρ√(S1 S2), S2].
Non-empty regions in the CR are not in the trapezium achievable by N(0, Σ(ρ)).
Use N(0, Σ(ρn)) with ρn = ρ + β/√n.
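As a concrete (hypothetical) membership test for this region, the sketch below scans a grid of β values and checks whether the shifted vector [L1, L1 + L2] − βD(ρ) lies in Ψ^{-1}(V(ρ), ε), using the closed-form expressions for D(ρ) and V(ρ) from the definitions above; the grid range and test points are arbitrary.

```python
import numpy as np
from scipy.stats import multivariate_normal

def Vdisp(x, y): return x * (y + 2) / (2 * (x + 1) * (y + 1))

def in_L(L1, L2, rho, S1, S2, eps, betas=np.linspace(-50.0, 50.0, 2001)):
    """Grid search over beta: is [L1, L1 + L2] in beta*D(rho) + Psi^{-1}(V(rho), eps)?"""
    a = (1 - rho**2) * S1
    b = S1 + S2 + 2 * rho * np.sqrt(S1 * S2)
    D = np.array([-S1 * rho / (1 + a), np.sqrt(S1 * S2) / (1 + b)])
    V = np.array([[Vdisp(a, a), Vdisp(a, b)],
                  [Vdisp(a, b), Vdisp(b, b)]])
    mvn = multivariate_normal(mean=[0.0, 0.0], cov=V)
    target = np.array([L1, L1 + L2])
    for beta in betas:
        z = target - beta * D
        if mvn.cdf([-z[0], -z[1]]) >= 1 - eps:   # (z1, z2) in Psi^{-1}(V, eps)
            return True
    return False

print(in_L(-2.0, -2.0, rho=0.5, S1=1.0, S2=1.0, eps=0.01))   # expected: True
print(in_L( 1.0,  1.0, rho=0.5, S1=1.0, S2=1.0, eps=0.01))   # expected: False
```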

Illustration of Second-Order Coding Rates
[Figure: Ψ^{-1}(V, ε) and L(ε; R1, R2) in the (L1, L2)-plane, for S1 = S2 = 1 and a fixed ρ.]
Achieving the second-order rates using a single input distribution N(0, Σ(ρ)) is not optimal.
The optimal second-order coding rate region is a half-space.

Main Ideas in Converse Proof: Part I
By a standard n → n + 1 argument [Shannon (1959)], we may consider codes with equal power constraints ‖x_j‖_2^2 = nS_j, j = 1, 2.
Reduction from average to maximal error probability via expurgation of polynomially many codeword pairs; this is not possible for the standard MAC [Dueck (1978)].
Reduction to constant correlation type classes
T_n(k) := {(x1, x2) : ⟨x1, x2⟩ / (‖x1‖_2 ‖x2‖_2) ∈ ((k − 1)/n, k/n]}, k = 1, 2, ..., n,
without too much loss in rate.

Main Ideas in Converse Proof: Part II
Verdú-Han-type converse: For any γ > 0 and any (Q_{Y|X2}, Q_Y), we have the non-asymptotic converse bound
ε_n ≥ Pr[ j1(X1, X2, Y) ≤ (1/n) log M_{1,n} − γ  or  j12(X1, X2, Y) ≤ (1/n) log(M_{1,n} M_{2,n}) − γ ] − 2 e^{−nγ},
where
j1(x1, x2, y) := (1/n) log [ W^n(y | x1, x2) / Q_{Y|X2}(y | x2) ]  and  j12(x1, x2, y) := (1/n) log [ W^n(y | x1, x2) / Q_Y(y) ].
Let j := [j1, j12]^T. Choose (Q_{Y|X2}, Q_Y) such that for (x1, x2) ∈ T_n(k),
E[ j(x1, x2, Y) ] ≈ I(ρ)  and  Cov[ √n j(x1, x2, Y) ] ≈ V(ρ),  where (k − 1)/n ≤ ρ ≤ k/n.
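To see that the information-density statistics behave as stated, here is a small Monte Carlo sketch under assumed parameter values, checking only the sum-rate density j12 with Q_Y taken to be the i.i.d. N(0, 1 + S1 + S2 + 2ρ√(S1S2)) output distribution (my choice for illustration): the empirical mean of j12 and the variance of √n·j12 should match I12(ρ) and V12,12(ρ).

```python
import numpy as np

def C(x): return 0.5 * np.log1p(x)
def Vdisp(x): return x * (x + 2) / (2 * (x + 1) ** 2)

# Example parameters (arbitrary); n even so the orthogonal vector below exists
n, S1, S2, rho = 1000, 1.0, 1.0, 0.5
S12 = S1 + S2 + 2 * rho * np.sqrt(S1 * S2)

# A deterministic codeword pair with exact powers n*S_j and empirical correlation rho
u = np.ones(n) / np.sqrt(n)
v = np.tile([1.0, -1.0], n // 2) / np.sqrt(n)        # unit vector orthogonal to u
x2 = np.sqrt(n * S2) * u
x1 = np.sqrt(n * S1) * (rho * u + np.sqrt(1 - rho**2) * v)

# Monte Carlo estimate of j12 = (1/n) log [ W^n(Y | x1, x2) / Q_Y(Y) ],
# with Q_Y = N(0, (1 + S12) I) per coordinate
rng = np.random.default_rng(0)
Z = rng.standard_normal((5000, n))
Y = x1 + x2 + Z
j12 = np.mean(0.5 * np.log(1 + S12) + Y**2 / (2 * (1 + S12)) - Z**2 / 2, axis=1)

print("E[j12]            =", j12.mean(), " vs I12(rho)     =", C(S12))
print("Var[sqrt(n) j12]  =", n * j12.var(), " vs V12,12(rho) =", Vdisp(S12))
```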

Main Ideas in Converse Proof: Part III
By evaluating the Verdú-Han bound using a multivariate Berry-Esseen theorem, there exists a sequence {ρn}_{n≥1} ⊂ [0, 1] satisfying
ρn = ρ + O(1/√n)
such that
(1/n) [ log M_{1,n}; log(M_{1,n} M_{2,n}) ] ∈ I(ρn) + (1/√n) Ψ^{-1}(V(ρn), ε) + o(1/√n) 1.
By a Taylor expansion of I(ρn) around I(ρ),
I(ρn) ≈ I(ρ) + (ρn − ρ) D(ρ),
and the Bolzano-Weierstrass theorem, we establish the converse for a point on the curved boundary.

Conclusion and Perspectives
Why is this possible vis-à-vis second-order analysis for the DM-MAC?
Gaussianity: allows us to identify the boundary of the CR, and time-sharing is not necessary.
Degraded message sets: codeword pairs do not have to be independent for the strong converse.
Key distinction relative to existing second-order works: D(ρ) = ∂I(ρ)/∂ρ.
The Gaussian broadcast channel is similar to this problem, so the techniques may carry over.
Full version:
