Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
1 Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities Vincent Y. F. Tan Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) September 2014 Vincent Tan (NUS) IT with Non-Vanishing Errors Chalmers University / 39
2 Acknowledgements This is joint work with 1 Sy-Quoc Le and Mehul Motani (NUS) 2 Jon Scarlett (Cambridge, now at EPFL)
3 Outline 1 Motivation, Background and History 2 Gaussian Interference Channel with Very Strong Interference 3 Gaussian MAC with Degraded Message Sets 4 Conclusion
8 Transmission of Information: Shannon's Figure 1
[Figure: INFORMATION SOURCE → TRANSMITTER → SIGNAL (corrupted by NOISE SOURCE) → RECEIVED SIGNAL → RECEIVER → DESTINATION]
Shannon abstracted away the meaning and semantics of information: all data are treated equally, and Shannon's bits serve as a universal currency, a crucial abstraction for modern communication and computing systems. He also relaxed computation and delay constraints to discover a fundamental limit, the capacity, providing a goal-post to work toward. Information theory: finding fundamental limits for reliable information transmission. Channel coding: concerned with the maximum rate of communication in bits/channel use.
10 Channel Coding (One-Shot)
[Figure: M → e → X → W → Y → d → M̂]
A code is a triple C = {M, e, d}, where M is the message set and b(e(m)) ≤ S for some cost function b(·) and cost S. The average error probability is
p_err(C) := Pr[M̂ ≠ M],
where M is uniform on M. A non-asymptotic fundamental limit can be defined as
M*(W, ε, S) := sup{m ∈ N : ∃ C s.t. m = |M|, p_err(C) ≤ ε}.
13 Channel Coding (n-shot) for AWGN
[Figure: M → e → X^n → W^n → Y^n → d → M̂]
Consider n independent uses of an additive white Gaussian noise channel W. For vectors x = (x_1, ..., x_n) ∈ R^n and y := (y_1, ..., y_n) ∈ R^n with ‖x‖₂² ≤ nS, the channel law is
W^n(y|x) = ∏_{i=1}^n (1/√(2π)) exp(−(y_i − x_i)²/2).
The non-asymptotic fundamental limit for n uses of W is M*(W^n, ε, S).
16 Single-User Asymptotic Evaluation
Theorem (Hayashi (2009), Polyanskiy-Poor-Verdú (2010), Tan-Tomamichel (2014)). For every ε ∈ (0, 1), we have
log M*(W^n, ε, S) = nC(S) + √(nV(S)) Φ^{-1}(ε) + (1/2) log n + O(1),
where C(S) and V(S) are the capacity and dispersion, defined as
C(S) = (1/2) log(1 + S),   V(S) = S(S + 2)/(2(S + 1)²) · log² e.
18 Single-User Asymptotic Evaluation: Interpretation
R*(W^n, ε, S) := (log M*(W^n, ε, S))/n = C(S) + √(V(S)/n) Φ^{-1}(ε) + O((log n)/n),
where the first two terms form the Gaussian approximation. Interpretation: the backoff from C(S) at finite blocklength n and tolerable error probability ε is approximately
√(V(S)/n) · Φ^{-1}(1 − ε).
Small ε implies a large backoff. One can compare this to actual finite-blocklength bounds; the Gaussian approximation is good for some n and ε.
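As a quick sanity check of the Gaussian approximation, here is a minimal numerical sketch (in nats; the helper names `capacity`, `dispersion` and `gaussian_approx` are ours, not from the cited works):

```python
import math
from statistics import NormalDist

def capacity(S):
    """AWGN capacity C(S) = (1/2) ln(1 + S), in nats per channel use."""
    return 0.5 * math.log1p(S)

def dispersion(S):
    """AWGN dispersion V(S) = S(S + 2) / (2 (S + 1)^2), in nats^2 per channel use."""
    return S * (S + 2) / (2 * (S + 1) ** 2)

def gaussian_approx(S, n, eps):
    """Gaussian approximation to the best rate: C(S) + sqrt(V(S)/n) * Phi^{-1}(eps)."""
    return capacity(S) + math.sqrt(dispersion(S) / n) * NormalDist().inv_cdf(eps)

# Backoff from capacity at blocklength n and error probability eps;
# since Phi^{-1}(eps) = -Phi^{-1}(1 - eps), this equals sqrt(V/n) * Phi^{-1}(1 - eps).
S, n, eps = 1.0, 1000, 1e-3
backoff = capacity(S) - gaussian_approx(S, n, eps)
```

For S = 1 and n = 1000 at ε = 10⁻³, the backoff is on the order of a few hundredths of a nat per channel use, consistent with the √(V/n) scaling.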
21 Background: Shannon's Channel Coding Theorem
Shannon's noisy channel coding theorem, together with the strong converses of Shannon (1959), Yoshihara (1964) and Wolfowitz (1978), states that
Theorem (Shannon (1949), Shannon (1959)). lim_{n→∞} (1/n) log M*(W^n, ε, S) = C(S) for all ε ∈ (0, 1).
23 Background: Shannon's Channel Coding Theorem
lim_{n→∞} (1/n) log M*(W^n, ε, S) = C(S) bits/channel use.
The channel coding theorem for AWGN channels is independent of ε ∈ (0, 1).
[Figure: lim_{n→∞} p_err(C_n) as a function of the rate R, jumping from 0 to 1 at R = C(S)]
Phase transition at capacity.
27 Background: Second-Order Coding Rates
What happens at capacity? More precisely, what happens when log M_n ≈ nC(S) + L√n for some L ∈ R? Here L is known as the second-order coding rate of the code. Note that L can be negative (cf. Hayashi (2008), Hayashi (2009)).
31 Background: Second-Order Coding Rates
Assume the rate of the code satisfies
(1/n) log M_n = C(S) + L/√n + o(1/√n).
[Figure: lim_{n→∞} p_err(C_n) as a function of L, increasing smoothly from 0 through 1/2 to 1]
Then
p_err(C_n) = Φ(L/√V(S)) + o(1).
For an error probability ε, the optimum second-order coding rate is
L*(ε) := √V(S) · Φ^{-1}(ε).
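The last two displays are inverses of one another: plugging L*(ε) into the limiting error probability returns ε. A small sketch (function names ours):

```python
import math
from statistics import NormalDist

def optimum_L(eps, V):
    """Optimum second-order coding rate L*(eps) = sqrt(V) * Phi^{-1}(eps)."""
    return math.sqrt(V) * NormalDist().inv_cdf(eps)

def limiting_error(L, V):
    """Limiting error probability Phi(L / sqrt(V)) of a code with second-order rate L."""
    return NormalDist().cdf(L / math.sqrt(V))

V, eps = 0.375, 0.01
L_star = optimum_L(eps, V)  # negative for eps < 1/2: rates below capacity
```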
35 Network Information Theory
Network Information Theory by A. El Gamal and Y.-H. Kim: many problems remain unsolved. My agenda is to understand the second-order behavior of solved NIT problems and to gain new insights into second-order optimal coding schemes. We study two simple NIT problems in the rest of the talk.
39 Outline 1 Motivation, Background and History 2 Gaussian Interference Channel with Very Strong Interference 3 Gaussian MAC with Degraded Message Sets 4 Conclusion
40 Gaussian Interference Channel
[Figure: M_1 → f_1 → X_1^n and M_2 → f_2 → X_2^n into W^n; outputs Y_1^n → φ_1 → M̂_1 and Y_2^n → φ_2 → M̂_2]
The two-sender, two-receiver GIC is given by
Y_{1i} = g_{11} X_{1i} + g_{12} X_{2i} + Z_{1i},   Y_{2i} = g_{21} X_{1i} + g_{22} X_{2i} + Z_{2i}.
The channel inputs are power limited:
∑_{i=1}^n X_{ji}² ≤ nS_j,   j = 1, 2.
The capacity region is an open problem in NIT.
43 Very Strong Interference
Define the signal-to-noise ratios snr_1 = g_{11}² S_1, snr_2 = g_{22}² S_2, and the interference-to-noise ratios inr_1 = g_{12}² S_2, inr_2 = g_{21}² S_1. A GIC is said to have strictly very strong interference (SVSI) if
snr_1 < inr_2/(1 + snr_2),   snr_2 < inr_1/(1 + snr_1),
or equivalently
C(snr_1) + C(snr_2) < min{C(snr_1 + inr_1), C(snr_2 + inr_2)}.
Intuition: Receiver 1 first decodes the interference M_2, then uses it to decode the intended message M_1 (and vice versa).
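The equivalence of the two SVSI characterizations follows by exponentiating the capacity inequality, e.g. (1+snr_1)(1+snr_2) < 1+snr_2+inr_2 rearranges to snr_1 < inr_2/(1+snr_2). A numerical sketch that spot-checks this on random channel parameters (function names ours):

```python
import math
import random

def C(x):
    # Gaussian capacity function C(x) = (1/2) ln(1 + x), in nats
    return 0.5 * math.log1p(x)

def svsi_snr_form(snr1, snr2, inr1, inr2):
    """SVSI via the SNR/INR inequalities."""
    return snr1 < inr2 / (1 + snr2) and snr2 < inr1 / (1 + snr1)

def svsi_capacity_form(snr1, snr2, inr1, inr2):
    """SVSI via the equivalent capacity inequality."""
    return C(snr1) + C(snr2) < min(C(snr1 + inr1), C(snr2 + inr2))

# Spot-check the equivalence on random (snr1, snr2, inr1, inr2) tuples.
random.seed(1)
params = [[random.uniform(0.01, 20.0) for _ in range(4)] for _ in range(10_000)]
agree = all(svsi_snr_form(*p) == svsi_capacity_form(*p) for p in params)
```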
46 Capacity Region for GICs with SVSI
[Figure: rectangular capacity region with corner (C_1, C_2); cases (i) on the vertical boundary, (ii) on the horizontal boundary, (iii) at the corner] (Photo: A. Carleial)
Carleial (1975) showed that the capacity region is
R_1 ≤ C_1 := C(snr_1),   R_2 ≤ C_2 := C(snr_2).
We examine deviations of order 1/√n away from the boundary; there are three distinct regions.
49 Second-Order Coding Rate Region
(L_1, L_2) is an (ε, R_1, R_2)-achievable second-order coding rate pair if there exists a sequence of (n, M_1n, M_2n, ε_n)-codes such that the code sizes M_jn satisfy
lim inf_{n→∞} (1/√n)(log M_jn − nR_j) ≥ L_j,   j = 1, 2,
and the average error probabilities ε_n satisfy
lim sup_{n→∞} ε_n ≤ ε.
If (L_1, L_2) is (ε, R_1, R_2)-achievable, then there exists a sequence of codes such that log M_jn ≥ nR_j + √n L_j + o(√n) and ε_n ≤ ε + o(1).
52 Optimum Second-Order Coding Rate Region
The set of all (ε, R_1, R_2)-achievable second-order coding rate pairs is called the (ε, R_1, R_2)-optimum second-order coding rate region L*(ε; R_1, R_2). Note: in the single-user case,
L*(ε) := sup L*(ε; R_1 = C_1) = √V(S_1) · Φ^{-1}(ε).
This follows from Hayashi's work (2009).
54 Characterizing L*(ε; R_1, R_2)
This is joint work with S.-Q. Le (NUS) and M. Motani (NUS). We want to characterize L*(ε; R_1, R_2) for all points (R_1, R_2) on the boundary of the capacity region. For all (R_1, R_2) in the interior of the capacity region, L*(ε; R_1, R_2) = R²; for all (R_1, R_2) in the exterior of the capacity region, L*(ε; R_1, R_2) = ∅, implying the strong converse.
56 Main Result: Vertical Boundary
[Figure: rectangular capacity region with case (i) on the vertical boundary R_1 = C_1, away from the corner]
In case (i), we are far from the horizontal boundary (R_2 < C_2), so the error event for user 2 vanishes (large deviations). Hence the second-order asymptotics pertain only to user 1:
L*(ε; R_1, R_2) = {(L_1, L_2) : L_1 ≤ √V_1 · Φ^{-1}(ε)}.
59 Main Result: Corner Point
[Figure: rectangular capacity region with case (iii) at the corner point (C_1, C_2)]
In case (iii), we are operating near both boundaries, so both constraints are active and both L_1 and L_2 appear in the optimum second-order coding rate region:
L*(ε; R_1, R_2) = {(L_1, L_2) : Φ(−L_1/√V_1) Φ(−L_2/√V_2) ≥ 1 − ε}.
61 Heuristic Derivation of Case (iii)
Let G_j := {M̂_j = M_j} be the event that message j = 1, 2 is decoded correctly, so Pr(G_1 ∩ G_2) ≥ 1 − ε. Assuming independence (which does not hold in general),
Pr(G_1) Pr(G_2) ≥ 1 − ε.
But we know from the single-user result that
Pr(G_j^c) ≈ Φ(L_j/√V_j),   i.e.   Pr(G_j) ≈ Φ(−L_j/√V_j).
So the set of all (ε, C_1, C_2)-achievable second-order coding rates is
Φ(−L_1/√V_1) Φ(−L_2/√V_2) ≥ 1 − ε.
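The corner-point region can be traced numerically from the product characterization above: for a fixed L_1, the largest admissible L_2 solves Φ(−L_2/√V_2) = (1 − ε)/Φ(−L_1/√V_1), provided the right side is below 1. A sketch (helper names ours):

```python
import math
from statistics import NormalDist

Phi, Phi_inv = NormalDist().cdf, NormalDist().inv_cdf

def in_corner_region(L1, L2, V1, V2, eps):
    """Membership in {(L1, L2): Phi(-L1/sqrt(V1)) * Phi(-L2/sqrt(V2)) >= 1 - eps}."""
    return Phi(-L1 / math.sqrt(V1)) * Phi(-L2 / math.sqrt(V2)) >= 1 - eps

def boundary_L2(L1, V1, V2, eps):
    """Largest L2 with (L1, L2) in the region, or None if L1 is infeasible."""
    target = (1 - eps) / Phi(-L1 / math.sqrt(V1))
    if target >= 1:
        return None  # even L2 -> -infinity cannot compensate for this L1
    return -math.sqrt(V2) * Phi_inv(target)

eps, V1, V2 = 0.1, 1.0, 2.0
L2b = boundary_L2(-2.0, V1, V2, eps)
```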
64 Illustration of Case (iii)
[Figure: boundaries of L*(ε; R_1, R_2) in the (L_1, L_2)-plane (nats/√use), for V_1 = V_2 = 2 and for a second pair of dispersion values]
65 Intuition Gleaned from the GIC with SVSI
Carleial (1975) remarked that very strong interference is as innocuous as no interference at all. We show that SVSI is innocuous in the sense that the capacities C_j and dispersions V_j are not affected, and the error events are approximately independent. The intuition that the GIC with SVSI is analogous to two independent direct channels carries over, even in the finer second-order sense.
69 Outline 1 Motivation, Background and History 2 Gaussian Interference Channel with Very Strong Interference 3 Gaussian MAC with Degraded Message Sets 4 Conclusion
70 Gaussian MAC with Degraded Message Sets
[Figure: messages M_1, M_2 encoded by f_1, f_2 into X_1^n, X_2^n; W^n outputs Y^n, decoded by φ to (M̂_1, M̂_2)]
The channel law is
Y_i = X_{1i} + X_{2i} + Z_i,
and the channel inputs are power limited:
∑_{i=1}^n X_{ji}² ≤ nS_j,   j = 1, 2.
72 Capacity Region
The capacity region is an exercise in NIT: it is the set of all (R_1, R_2) such that
R_1 ≤ C(S_1(1 − ρ²)),   R_1 + R_2 ≤ C(S_1 + S_2 + 2ρ√(S_1 S_2))
for some 0 ≤ ρ ≤ 1. The achievability uses superposition coding (Cover (1972)).
75 Capacity Region
[Figure: the capacity region boundary in the (R_1, R_2)-plane (nats/use), with the trapezia for ρ = 0, ρ = 1/3 and ρ = 2/3]
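The trapezia above can be reproduced numerically; a minimal sketch (in nats, helper names ours) that also confirms the two extreme cases, ρ = 0 (no cooperation: the individual bound is C(S_1), the sum bound C(S_1 + S_2)) and ρ = 1 (fully coherent: the individual bound collapses to 0):

```python
import math

def C(x):
    # Gaussian capacity function C(x) = (1/2) ln(1 + x), in nats
    return 0.5 * math.log1p(x)

def rate_constraints(rho, S1, S2):
    """(bound on R1, bound on R1 + R2) for a given input correlation rho in [0, 1]."""
    return (C(S1 * (1 - rho ** 2)),
            C(S1 + S2 + 2 * rho * math.sqrt(S1 * S2)))

# One trapezium per correlation value; the region is the union over rho.
S1, S2 = 1.0, 1.0
trapezia = [rate_constraints(r / 10, S1, S2) for r in range(11)]
```

As ρ grows, the R_1 bound shrinks while the sum-rate bound grows, which is exactly the trade-off traced by the curved part of the boundary.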
76 Characterizing L*(ε; R_1, R_2)
This is joint work with Jon Scarlett. We want to characterize L*(ε; R_1, R_2) for all points (R_1, R_2) on the boundary of the capacity region. For all (R_1, R_2) in the interior of the capacity region, L*(ε; R_1, R_2) = R²; for all (R_1, R_2) in the exterior of the capacity region, L*(ε; R_1, R_2) = ∅, implying the strong converse.
78 Some Basic Definitions
Mutual informations:
I(ρ) := [I_1(ρ); I_12(ρ)] = [C((1 − ρ²)S_1); C(S_1 + S_2 + 2ρ√(S_1 S_2))].
Derivative of the mutual informations:
D(ρ) := (∂/∂ρ) I(ρ) = [−S_1 ρ / (1 + S_1(1 − ρ²)); √(S_1 S_2) / (1 + S_1 + S_2 + 2ρ√(S_1 S_2))].
Dispersions: with V(x, y) := x(y + 2) / (2(x + 1)(y + 1)) · log² e and V(x) := V(x, x),
V(ρ) := [V_1(ρ), V_1,12(ρ); V_1,12(ρ), V_12,12(ρ)],
where
V_1(ρ) := V((1 − ρ²)S_1),   V_12,12(ρ) := V(S_1 + S_2 + 2ρ√(S_1 S_2)),
V_1,12(ρ) := V((1 − ρ²)S_1, S_1 + S_2 + 2ρ√(S_1 S_2)).
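The closed form of D(ρ) can be checked against a central-difference derivative of I(ρ); a sketch in nats (dropping the log² e factor; function names ours):

```python
import math

def I_vec(rho, S1, S2):
    """[I1(rho), I12(rho)] in nats."""
    C = lambda x: 0.5 * math.log1p(x)
    return [C((1 - rho ** 2) * S1),
            C(S1 + S2 + 2 * rho * math.sqrt(S1 * S2))]

def D_vec(rho, S1, S2):
    """Claimed closed form of dI/drho."""
    return [-S1 * rho / (1 + S1 * (1 - rho ** 2)),
            math.sqrt(S1 * S2) / (1 + S1 + S2 + 2 * rho * math.sqrt(S1 * S2))]

# Central-difference check at an interior point.
S1, S2, rho, h = 2.0, 3.0, 0.4, 1e-6
numeric = [(a - b) / (2 * h)
           for a, b in zip(I_vec(rho + h, S1, S2), I_vec(rho - h, S1, S2))]
closed = D_vec(rho, S1, S2)
```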
81 Generalization of the Inverse CDF of a Gaussian
For a positive semi-definite matrix V,
Ψ(z_1, z_2; V) := ∫_{−∞}^{z_1} ∫_{−∞}^{z_2} N(u; 0, V) du.
Given ε ∈ (0, 1),
Ψ^{-1}(V, ε) := {(z_1, z_2) : Ψ(−z_1, −z_2; V) ≥ 1 − ε}.
[Figure: boundaries of Ψ^{-1}(V, ε) in the (z_1, z_2)-plane for two values of ρ, each at ε = 0.01 and a larger ε]
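For the standard bivariate case (unit variances, correlation r), Ψ can be evaluated by a one-dimensional quadrature using the standard conditional-CDF identity Ψ(z_1, z_2) = ∫_{−∞}^{z_1} φ(u) Φ((z_2 − ru)/√(1 − r²)) du; a sketch (function names ours, |r| < 1 assumed):

```python
import math
from statistics import NormalDist

phi, Phi = NormalDist().pdf, NormalDist().cdf

def Psi(z1, z2, r, steps=4000):
    """P(U1 <= z1, U2 <= z2) for a standard bivariate normal with correlation r,
    via Simpson's rule on [-8, z1] (the tail below -8 is negligible)."""
    lo = -8.0
    if z1 <= lo:
        return 0.0
    h = (z1 - lo) / steps  # steps must be even for Simpson's rule
    f = lambda u: phi(u) * Phi((z2 - r * u) / math.sqrt(1 - r * r))
    total = f(lo) + f(z1)
    for k in range(1, steps):
        total += (4 if k % 2 else 2) * f(lo + k * h)
    return total * h / 3
```

Two sanity checks: for r = 0, Ψ factors as Φ(z_1)Φ(z_2), and the orthant probability Ψ(0, 0) equals 1/4 + arcsin(r)/(2π).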
83 The Main Result: Vertical Boundary
Points on the vertical boundary reduce to a scalar dispersion, since the sum-rate constraint is in the error-exponents regime:
L*(ε; R_1, R_2) = {(L_1, L_2) : L_1 ≤ √(V_1(0)) Φ^{-1}(ε)}.
[Figure: the capacity region with the point (R_1, R_2) on the vertical part of the boundary]
The following expansion holds for the cardinality of the first codebook:
(log M_1n)/n ≈ I_1(0) + √(V_1(0)/n) Φ^{-1}(ε),
and we are far from the sum-rate constraint:
lim_{n→∞} (1/n) log(M_1n M_2n) < I_12(0).
85 The Main Result: Curved Boundary
The behavior is different in the curved region:
L*(ε; R_1, R_2) = {(L_1, L_2) : [L_1; L_1 + L_2] ∈ ∪_{β ∈ R} {βD(ρ) + Ψ^{-1}(V(ρ), ε)}}.
[Figure: the capacity region with the point (R_1, R_2) on the curved part of the boundary]
The derivative D(ρ) does not usually appear in such results. The term Ψ^{-1}(V(ρ), ε) corresponds to using a Gaussian input with covariance matrix
Σ(ρ) = [S_1, ρ√(S_1 S_2); ρ√(S_1 S_2), S_2].
88 Achieving All Second-Order Rate Pairs
L*(ε; R_1, R_2) = {(L_1, L_2) : [L_1; L_1 + L_2] ∈ ∪_{β ∈ R} {βD(ρ) + Ψ^{-1}(V(ρ), ε)}}.
The non-empty parts of this region are achievable by inputs distributed as
N(0, Σ(ρ)),   Σ(ρ) = [S_1, ρ√(S_1 S_2); ρ√(S_1 S_2), S_2],
with the correlation made blocklength dependent: ρ_n = ρ + β/√n. By a Taylor expansion,
I(ρ_n) ≈ I(ρ) + (ρ_n − ρ)D(ρ) = I(ρ) + βD(ρ)/√n,
explaining the slope term.
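The Taylor step can be verified numerically: the error of the first-order expansion I(ρ) + βD(ρ)/√n decays as O(1/n), i.e. it drops by a factor of about 4 when n is quadrupled. A sketch in nats (function names ours):

```python
import math

def I_vec(rho, S1, S2):
    """[I1(rho), I12(rho)] in nats."""
    C = lambda x: 0.5 * math.log1p(x)
    return [C((1 - rho ** 2) * S1),
            C(S1 + S2 + 2 * rho * math.sqrt(S1 * S2))]

def D_vec(rho, S1, S2):
    """Closed form of dI/drho."""
    return [-S1 * rho / (1 + S1 * (1 - rho ** 2)),
            math.sqrt(S1 * S2) / (1 + S1 + S2 + 2 * rho * math.sqrt(S1 * S2))]

S1 = S2 = 1.0
rho, beta = 0.5, 2.0

def expansion_error(n):
    """max-norm gap between I(rho + beta/sqrt(n)) and the first-order expansion."""
    rn = rho + beta / math.sqrt(n)
    approx = [i + beta * d / math.sqrt(n)
              for i, d in zip(I_vec(rho, S1, S2), D_vec(rho, S1, S2))]
    return max(abs(a - e) for a, e in zip(approx, I_vec(rn, S1, S2)))
```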
91 Illustration of Second-Order Coding Rates
[Figure: the half-space L*(ε; R_1, R_2) in the (L_1, L_2)-plane, together with the boundary of Ψ^{-1}(V, ε) for a fixed β]
For S_1 = S_2 = 1 and ρ = 1/2, L*(ε; R_1, R_2) is a half-space. The second-order rates achieved using the single input distribution N(0, Σ(ρ)) are not optimal: one needs to vary the input distribution with the blocklength to achieve all points in L*(ε; R_1, R_2).
94 Outline 1 Motivation, Background and History 2 Gaussian Interference Channel with Very Strong Interference 3 Gaussian MAC with Degraded Message Sets 4 Conclusion
95 Conclusion
Asymptotic expansions for NIT problems with non-vanishing error probabilities are a very fertile area of research. Other single-user problems that have been solved include: 1 quasi-static MIMO fading channels (Yang-Durisi-Koch-Polyanskiy), 2 channels with discrete state (Tomamichel-Tan), 3 dirty-paper coding (Scarlett). Shameless self-promotion: V. Y. F. Tan, "Asymptotic Expansions in Information Theory with Non-Vanishing Error Probabilities," Now Publishers, Foundations and Trends in Communications and Information Theory.
More informationDispersion of the Gilbert-Elliott Channel
Dispersion of the Gilbert-Elliott Channel Yury Polyanskiy Email: ypolyans@princeton.edu H. Vincent Poor Email: poor@princeton.edu Sergio Verdú Email: verdu@princeton.edu Abstract Channel dispersion plays
More informationInformation Theory for Wireless Communications, Part II:
Information Theory for Wireless Communications, Part II: Lecture 5: Multiuser Gaussian MIMO Multiple-Access Channel Instructor: Dr Saif K Mohammed Scribe: Johannes Lindblom In this lecture, we give the
More informationAppendix B Information theory from first principles
Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes
More informationLecture 4 Channel Coding
Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity
More informationChannel Dependent Adaptive Modulation and Coding Without Channel State Information at the Transmitter
Channel Dependent Adaptive Modulation and Coding Without Channel State Information at the Transmitter Bradford D. Boyle, John MacLaren Walsh, and Steven Weber Modeling & Analysis of Networks Laboratory
More informationSurvey of Interference Channel
Survey of Interference Channel Alex Dytso Department of Electrical and Computer Engineering University of Illinois at Chicago, Chicago IL 60607, USA, Email: odytso2 @ uic.edu Abstract Interference Channel(IC)
More informationChapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University
Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission
More informationPhysical-Layer MIMO Relaying
Model Gaussian SISO MIMO Gauss.-BC General. Physical-Layer MIMO Relaying Anatoly Khina, Tel Aviv University Joint work with: Yuval Kochman, MIT Uri Erez, Tel Aviv University August 5, 2011 Model Gaussian
More informationConverse bounds for private communication over quantum channels
Converse bounds for private communication over quantum channels Mark M. Wilde (LSU) joint work with Mario Berta (Caltech) and Marco Tomamichel ( Univ. Sydney + Univ. of Technology, Sydney ) arxiv:1602.08898
More informationInformation Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results
Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel
More informationSecure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel
Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Pritam Mukherjee Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 074 pritamm@umd.edu
More informationThe PPM Poisson Channel: Finite-Length Bounds and Code Design
August 21, 2014 The PPM Poisson Channel: Finite-Length Bounds and Code Design Flavio Zabini DEI - University of Bologna and Institute for Communications and Navigation German Aerospace Center (DLR) Balazs
More informationDiversity-Multiplexing Tradeoff in MIMO Channels with Partial CSIT. ECE 559 Presentation Hoa Pham Dec 3, 2007
Diversity-Multiplexing Tradeoff in MIMO Channels with Partial CSIT ECE 559 Presentation Hoa Pham Dec 3, 2007 Introduction MIMO systems provide two types of gains Diversity Gain: each path from a transmitter
More informationOn Network Interference Management
On Network Interference Management Aleksandar Jovičić, Hua Wang and Pramod Viswanath March 3, 2008 Abstract We study two building-block models of interference-limited wireless networks, motivated by the
More informationA Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels
A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,
More informationHalf-Duplex Gaussian Relay Networks with Interference Processing Relays
Half-Duplex Gaussian Relay Networks with Interference Processing Relays Bama Muthuramalingam Srikrishna Bhashyam Andrew Thangaraj Department of Electrical Engineering Indian Institute of Technology Madras
More informationIN this paper, we show that the scalar Gaussian multiple-access
768 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 5, MAY 2004 On the Duality of Gaussian Multiple-Access and Broadcast Channels Nihar Jindal, Student Member, IEEE, Sriram Vishwanath, and Andrea
More informationA Comparison of Superposition Coding Schemes
A Comparison of Superposition Coding Schemes Lele Wang, Eren Şaşoğlu, Bernd Bandemer, and Young-Han Kim Department of Electrical and Computer Engineering University of California, San Diego La Jolla, CA
More informationDegrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages
Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege
More informationA Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels
A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu
More informationApproximate Capacity of Fast Fading Interference Channels with no CSIT
Approximate Capacity of Fast Fading Interference Channels with no CSIT Joyson Sebastian, Can Karakus, Suhas Diggavi Abstract We develop a characterization of fading models, which assigns a number called
More informationFinite-Blocklength Bounds on the Maximum Coding Rate of Rician Fading Channels with Applications to Pilot-Assisted Transmission
Finite-Blocklength Bounds on the Maximum Coding Rate of Rician Fading Channels with Applications to Pilot-Assisted Transmission Johan Östman, Giuseppe Durisi, and Erik G. Ström Dept. of Signals and Systems,
More informationCovert Communication with Channel-State Information at the Transmitter
Covert Communication with Channel-State Information at the Transmitter Si-Hyeon Lee Joint Work with Ligong Wang, Ashish Khisti, and Gregory W. Wornell July 27, 2017 1 / 21 Covert Communication Transmitter
More informationOn the Secrecy Capacity of Fading Channels
On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University
More informationQuantum Achievability Proof via Collision Relative Entropy
Quantum Achievability Proof via Collision Relative Entropy Salman Beigi Institute for Research in Fundamental Sciences (IPM) Tehran, Iran Setemper 8, 2014 Based on a joint work with Amin Gohari arxiv:1312.3822
More informationOn Gaussian MIMO Broadcast Channels with Common and Private Messages
On Gaussian MIMO Broadcast Channels with Common and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ersen@umd.edu
More informationIncremental Coding over MIMO Channels
Model Rateless SISO MIMO Applications Summary Incremental Coding over MIMO Channels Anatoly Khina, Tel Aviv University Joint work with: Yuval Kochman, MIT Uri Erez, Tel Aviv University Gregory W. Wornell,
More informationShannon s noisy-channel theorem
Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for
More informationOn the Duality between Multiple-Access Codes and Computation Codes
On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch
More informationVariable Rate Channel Capacity. Jie Ren 2013/4/26
Variable Rate Channel Capacity Jie Ren 2013/4/26 Reference This is a introduc?on of Sergio Verdu and Shlomo Shamai s paper. Sergio Verdu and Shlomo Shamai, Variable- Rate Channel Capacity, IEEE Transac?ons
More informationAn Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel
Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and
More informationOn the Duality of Gaussian Multiple-Access and Broadcast Channels
On the Duality of Gaussian ultiple-access and Broadcast Channels Xiaowei Jin I. INTODUCTION Although T. Cover has been pointed out in [] that one would have expected a duality between the broadcast channel(bc)
More informationApproximately achieving the feedback interference channel capacity with point-to-point codes
Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used
More informationSoft Covering with High Probability
Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information
More informationDraft. On Limiting Expressions for the Capacity Region of Gaussian Interference Channels. Mojtaba Vaezi and H. Vincent Poor
Preliminaries Counterexample Better Use On Limiting Expressions for the Capacity Region of Gaussian Interference Channels Mojtaba Vaezi and H. Vincent Poor Department of Electrical Engineering Princeton
More informationEE 4TM4: Digital Communications II. Channel Capacity
EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.
More informationOptimal Power Control in Decentralized Gaussian Multiple Access Channels
1 Optimal Power Control in Decentralized Gaussian Multiple Access Channels Kamal Singh Department of Electrical Engineering Indian Institute of Technology Bombay. arxiv:1711.08272v1 [eess.sp] 21 Nov 2017
More informationDecentralized Simultaneous Energy and Information Transmission in Multiple Access Channels
Decentralized Simultaneous Energy and Information Transmission in Multiple Access Channels Selma Belhadj Amor, Samir M. Perlaza To cite this version: Selma Belhadj Amor, Samir M. Perlaza. Decentralized
More informationA Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks
A Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks Marco Tomamichel, Masahito Hayashi arxiv: 1208.1478 Also discussing results of: Second Order Asymptotics for Quantum
More informationDegrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel
/33 Degrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel Presented by Paul de Kerret Joint work with Antonio Bazco, Nicolas Gresset, and David Gesbert ESIT 2017 in Madrid,
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More informationPrinciples of Communications
Principles of Communications Weiyao Lin Shanghai Jiao Tong University Chapter 10: Information Theory Textbook: Chapter 12 Communication Systems Engineering: Ch 6.1, Ch 9.1~ 9. 92 2009/2010 Meixia Tao @
More informationOuter Bounds on the Secrecy Capacity Region of the 2-user Z Interference Channel With Unidirectional Transmitter Cooperation
Outer Bounds on the Secrecy Capacity Region of the 2-user Z Interference Channel With Unidirectional Transmitter Cooperation Parthajit Mohapatra, Chandra R. Murthy, and Jemin Lee itrust, Centre for Research
More informationShannon s A Mathematical Theory of Communication
Shannon s A Mathematical Theory of Communication Emre Telatar EPFL Kanpur October 19, 2016 First published in two parts in the July and October 1948 issues of BSTJ. First published in two parts in the
More informationInterference Alignment at Finite SNR for TI channels
Interference Alignment at Finite SNR for Time-Invariant channels Or Ordentlich Joint work with Uri Erez EE-Systems, Tel Aviv University ITW 20, Paraty Background and previous work The 2-user Gaussian interference
More informationError Exponent Region for Gaussian Broadcast Channels
Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI
More informationA new converse in rate-distortion theory
A new converse in rate-distortion theory Victoria Kostina, Sergio Verdú Dept. of Electrical Engineering, Princeton University, NJ, 08544, USA Abstract This paper shows new finite-blocklength converse bounds
More informationDecentralized K-User Gaussian Multiple Access Channels
Decentralized K-User Gaussian Multiple Access Channels Selma Belhadj Amor, Samir Perlaza To cite this version: Selma Belhadj Amor, Samir Perlaza. Decentralized K-User Gaussian Multiple Access Channels.
More informationCompetition and Cooperation in Multiuser Communication Environments
Competition and Cooperation in Multiuser Communication Environments Wei Yu Electrical Engineering Department Stanford University April, 2002 Wei Yu, Stanford University Introduction A multiuser communication
More informationThe Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference
The and The Binary Dirty Multiple Access Channel with Common Interference Dept. EE - Systems, Tel Aviv University, Tel Aviv, Israel April 25th, 2010 M.Sc. Presentation The B/G Model Compound CSI Smart
More informationAhmed S. Mansour, Rafael F. Schaefer and Holger Boche. June 9, 2015
The Individual Secrecy Capacity of Degraded Multi-Receiver Wiretap Broadcast Channels Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche Lehrstuhl für Technische Universität München, Germany Department
More informationRevision of Lecture 5
Revision of Lecture 5 Information transferring across channels Channel characteristics and binary symmetric channel Average mutual information Average mutual information tells us what happens to information
More informationACOMMUNICATION situation where a single transmitter
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 9, SEPTEMBER 2004 1875 Sum Capacity of Gaussian Vector Broadcast Channels Wei Yu, Member, IEEE, and John M. Cioffi, Fellow, IEEE Abstract This paper
More informationEfficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel
Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel Ahmad Abu Al Haija ECE Department, McGill University, Montreal, QC, Canada Email: ahmad.abualhaija@mail.mcgill.ca
More informationSimultaneous SDR Optimality via a Joint Matrix Decomp.
Simultaneous SDR Optimality via a Joint Matrix Decomposition Joint work with: Yuval Kochman, MIT Uri Erez, Tel Aviv Uni. May 26, 2011 Model: Source Multicasting over MIMO Channels z 1 H 1 y 1 Rx1 ŝ 1 s
More informationECE Information theory Final (Fall 2008)
ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1
More informationSparse Regression Codes for Multi-terminal Source and Channel Coding
Sparse Regression Codes for Multi-terminal Source and Channel Coding Ramji Venkataramanan Yale University Sekhar Tatikonda Allerton 2012 1 / 20 Compression with Side-Information X Encoder Rate R Decoder
More informationA Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying
A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,
More informationOn the Power Allocation for Hybrid DF and CF Protocol with Auxiliary Parameter in Fading Relay Channels
On the Power Allocation for Hybrid DF and CF Protocol with Auxiliary Parameter in Fading Relay Channels arxiv:4764v [csit] 4 Dec 4 Zhengchuan Chen, Pingyi Fan, Dapeng Wu and Liquan Shen Tsinghua National
More informationLecture 7 MIMO Communica2ons
Wireless Communications Lecture 7 MIMO Communica2ons Prof. Chun-Hung Liu Dept. of Electrical and Computer Engineering National Chiao Tung University Fall 2014 1 Outline MIMO Communications (Chapter 10
More informationTransmitting k samples over the Gaussian channel: energy-distortion tradeoff
Transmitting samples over the Gaussian channel: energy-distortion tradeoff Victoria Kostina Yury Polyansiy Sergio Verdú California Institute of Technology Massachusetts Institute of Technology Princeton
More informationA New Metaconverse and Outer Region for Finite-Blocklength MACs
A New Metaconverse Outer Region for Finite-Blocklength MACs Pierre Moulin Dept of Electrical Computer Engineering University of Illinois at Urbana-Champaign Urbana, IL 680 Abstract An outer rate region
More informationOptimal Power Allocation for Parallel Gaussian Broadcast Channels with Independent and Common Information
SUBMIED O IEEE INERNAIONAL SYMPOSIUM ON INFORMAION HEORY, DE. 23 1 Optimal Power Allocation for Parallel Gaussian Broadcast hannels with Independent and ommon Information Nihar Jindal and Andrea Goldsmith
More informationCoding over Interference Channels: An Information-Estimation View
Coding over Interference Channels: An Information-Estimation View Shlomo Shamai Department of Electrical Engineering Technion - Israel Institute of Technology Information Systems Laboratory Colloquium
More informationSimultaneous Information and Energy Transmission: A Finite Block-Length Analysis
Simultaneous Information and Energy Transmission: A Finite Block-Length Analysis Samir Perlaza, Ali Tajer, H. Vincent Poor To cite this version: Samir Perlaza, Ali Tajer, H. Vincent Poor. Simultaneous
More informationLecture 8: Shannon s Noise Models
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have
More informationComputing and Communications 2. Information Theory -Entropy
1896 1920 1987 2006 Computing and Communications 2. Information Theory -Entropy Ying Cui Department of Electronic Engineering Shanghai Jiao Tong University, China 2017, Autumn 1 Outline Entropy Joint entropy
More informationClassical communication over classical channels using non-classical correlation. Will Matthews University of Waterloo
Classical communication over classical channels using non-classical correlation. Will Matthews IQC @ University of Waterloo 1 Classical data over classical channels Q F E n Y X G ˆQ Finite input alphabet
More information820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY Stefano Rini, Daniela Tuninetti, and Natasha Devroye
820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY 2012 Inner and Outer Bounds for the Gaussian Cognitive Interference Channel and New Capacity Results Stefano Rini, Daniela Tuninetti,
More informationMulti-User Gain Maximum Eigenmode Beamforming, and IDMA. Peng Wang and Li Ping City University of Hong Kong
Multi-User Gain Maximum Eigenmode Beamforming, and IDMA Peng Wang and Li Ping City University of Hong Kong 1 Contents Introduction Multi-user gain (MUG) Maximum eigenmode beamforming (MEB) MEB performance
More informationDuality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels
2658 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 49, NO 10, OCTOBER 2003 Duality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels Sriram Vishwanath, Student Member, IEEE, Nihar
More informationA Systematic Approach for Interference Alignment in CSIT-less Relay-Aided X-Networks
A Systematic Approach for Interference Alignment in CSIT-less Relay-Aided X-Networks Daniel Frank, Karlheinz Ochs, Aydin Sezgin Chair of Communication Systems RUB, Germany Email: {danielfrank, karlheinzochs,
More informationSuperposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels
Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:
More informationInteractions of Information Theory and Estimation in Single- and Multi-user Communications
Interactions of Information Theory and Estimation in Single- and Multi-user Communications Dongning Guo Department of Electrical Engineering Princeton University March 8, 2004 p 1 Dongning Guo Communications
More informationSimultaneous Nonunique Decoding Is Rate-Optimal
Fiftieth Annual Allerton Conference Allerton House, UIUC, Illinois, USA October 1-5, 2012 Simultaneous Nonunique Decoding Is Rate-Optimal Bernd Bandemer University of California, San Diego La Jolla, CA
More informationDecentralized Simultaneous Information and Energy Transmission in K-User Multiple Access Channels
Decentralized Simultaneous Information and Energy Transmission in K-User Multiple Access Channels Selma Belhadj Amor, Samir M. Perlaza, and H. Vincent Poor 1 arxiv:1606.04432v2 [cs.it] 26 Sep 2017 Abstract
More informationCapacity of Block Rayleigh Fading Channels Without CSI
Capacity of Block Rayleigh Fading Channels Without CSI Mainak Chowdhury and Andrea Goldsmith, Fellow, IEEE Department of Electrical Engineering, Stanford University, USA Email: mainakch@stanford.edu, andrea@wsl.stanford.edu
More informationSemidefinite programming strong converse bounds for quantum channel capacities
Semidefinite programming strong converse bounds for quantum channel capacities Xin Wang UTS: Centre for Quantum Software and Information Joint work with Runyao Duan and Wei Xie arxiv:1610.06381 & 1601.06888
More informationRepresentation of Correlated Sources into Graphs for Transmission over Broadcast Channels
Representation of Correlated s into Graphs for Transmission over Broadcast s Suhan Choi Department of Electrical Eng. and Computer Science University of Michigan, Ann Arbor, MI 80, USA Email: suhanc@eecs.umich.edu
More informationInformation-Theoretic Limits of Group Testing: Phase Transitions, Noisy Tests, and Partial Recovery
Information-Theoretic Limits of Group Testing: Phase Transitions, Noisy Tests, and Partial Recovery Jonathan Scarlett jonathan.scarlett@epfl.ch Laboratory for Information and Inference Systems (LIONS)
More informationMultiple-Input Multiple-Output Systems
Multiple-Input Multiple-Output Systems What is the best way to use antenna arrays? MIMO! This is a totally new approach ( paradigm ) to wireless communications, which has been discovered in 95-96. Performance
More informationDecoupling of CDMA Multiuser Detection via the Replica Method
Decoupling of CDMA Multiuser Detection via the Replica Method Dongning Guo and Sergio Verdú Dept. of Electrical Engineering Princeton University Princeton, NJ 08544, USA email: {dguo,verdu}@princeton.edu
More informationFeedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case
1 arxiv:0901.3580v1 [cs.it] 23 Jan 2009 Feedback Capacity of the Gaussian Interference Channel to Within 1.7075 Bits: the Symmetric Case Changho Suh and David Tse Wireless Foundations in the Department
More informationRCA Analysis of the Polar Codes and the use of Feedback to aid Polarization at Short Blocklengths
RCA Analysis of the Polar Codes and the use of Feedback to aid Polarization at Short Blocklengths Kasra Vakilinia, Dariush Divsalar*, and Richard D. Wesel Department of Electrical Engineering, University
More informationMismatched Multi-letter Successive Decoding for the Multiple-Access Channel
Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org
More informationOn the Capacity of the Interference Channel with a Relay
On the Capacity of the Interference Channel with a Relay Ivana Marić, Ron Dabora and Andrea Goldsmith Stanford University, Stanford, CA {ivanam,ron,andrea}@wsl.stanford.edu Abstract Capacity gains due
More information