Classical communication over classical channels using non-classical correlation

Will Matthews, University of Waterloo
Classical data over classical channels

$Q \to F \to X \to E^n \to Y \to G \to \hat{Q}$

- Finite input alphabet $A$, output alphabet $B$.
- Channel input $X$ in $A^n$, output $Y$ in $B^n$ (random variables).
- $\Pr(Y = y \mid X = x, E) = E^n(y|x)$. Discrete memoryless channel (DMC): $E^n(y|x) = \prod_{i=1}^n E(y_i|x_i)$.
- Message $Q$ and decoded message $\hat{Q}$ in $[M] := \{1, \dots, M\}$.
- Classical code: encoder $\Pr(X = x \mid Q = q) = F(x|q)$; decoder $\Pr(\hat{Q} = \hat{q} \mid Y = y) = G(\hat{q}|y)$.
- $M_\epsilon(E^n) := \max M$ such that there is a code with $\Pr(Q \neq \hat{Q} \mid E^n, F, G) \leq \epsilon$ (hard to compute exactly).
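The error probability in the last definition is a finite sum over $(q, x, y)$, so small codes can be evaluated directly. A minimal numerical sketch (assuming numpy; the BSC, repetition encoder, and majority-vote decoder are illustrative choices, not taken from the slides):

```python
import numpy as np

# Channel E(y|x): binary symmetric channel with flip probability 0.1.
delta = 0.1
E = np.array([[1 - delta, delta],
              [delta, 1 - delta]])   # E[x, y] = E(y|x)

def error_prob(F, G, En):
    """Pr(Q != Qhat) for encoder F[q, x], decoder G[y, qhat],
    channel En[x, y], with Q uniform on [M]."""
    M = F.shape[0]
    # (1/M) * sum_{q,x,y} F(x|q) En(y|x) G(q|y) = Pr(correct decoding)
    p_correct = np.einsum('qx,xy,yq->', F, En, G) / M
    return 1 - p_correct

# n = 3 uses of the BSC: the DMC channel law factorises, so E^3 = E (x) E (x) E.
En = E
for _ in range(2):
    En = np.kron(En, E)

# M = 2 messages: 3-bit repetition encoder, majority-vote decoder.
F = np.zeros((2, 8)); F[0, 0b000] = 1; F[1, 0b111] = 1
G = np.zeros((8, 2))
for y in range(8):
    G[y, int(bin(y).count('1') >= 2)] = 1

print(error_prob(F, G, En))   # 3*delta^2*(1-delta) + delta^3 = 0.028
```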
Classical data over classical channels

- Correlated encoder/decoder: $\Pr(X = x, \hat{Q} = \hat{q} \mid Q = q, Y = y) = Z(x, \hat{q} \mid q, y)$.
- Entanglement assisted: $Z(x, \hat{q} \mid q, y) = \mathrm{Tr}\big[(F^{(q)}_x \otimes G^{(y)}_{\hat{q}})\, \rho_{AB}\big]$.
- Non-signalling assisted: $Z(x, \hat{q} \mid q, y)$ is non-signalling.
- $M^{E}_\epsilon(E^n) := \max M$ such that there is an entanglement-assisted code with $\Pr(Q \neq \hat{Q} \mid E^n, Z) \leq \epsilon$.
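The non-signalling condition on $Z$ consists of two marginal constraints: Alice's output $X$ may not depend on Bob's input $Y$, and Bob's output $\hat{Q}$ may not depend on Alice's input $Q$. A small sketch of that check (assuming $Z$ is stored as a numpy array indexed as Z[x, qhat, q, y]):

```python
import numpy as np

def is_non_signalling(Z, tol=1e-9):
    """Z[x, qhat, q, y] = Pr(X=x, Qhat=qhat | Q=q, Y=y).
    Checks the two no-signalling marginal conditions."""
    # Alice's marginal on x must not depend on Bob's input y:
    # sum_qhat Z(x, qhat | q, y) is constant in y.
    marg_x = Z.sum(axis=1)            # shape (|A|, M, |B|): (x, q, y)
    ok_a = np.allclose(marg_x, marg_x[:, :, :1], atol=tol)
    # Bob's marginal on qhat must not depend on Alice's input q:
    # sum_x Z(x, qhat | q, y) is constant in q.
    marg_q = Z.sum(axis=0)            # shape (M, M, |B|): (qhat, q, y)
    ok_b = np.allclose(marg_q, marg_q[:, :1, :], atol=tol)
    return ok_a and ok_b

# Example: a product of local strategies F(x|q) G(qhat|y) is non-signalling.
A, B, M = 2, 2, 2
F = np.full((M, A), 1 / A)            # Pr(x | q)
G = np.full((B, M), 1 / M)            # Pr(qhat | y)
Z = np.einsum('qx,yk->xkqy', F, G)    # Z[x, qhat, q, y] = F(x|q) G(qhat|y)
print(is_non_signalling(Z))           # True
```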
Capacity

- $C_\epsilon(E) := \lim_{n \to \infty} \frac{1}{n} \log M_\epsilon(E^n)$, and $C(E) := \lim_{\epsilon \to 0} C_\epsilon(E)$; analogously $C^\Omega_\epsilon$, $C^\Omega$ for each class $\Omega$ of assisted codes.
- In general, zero-error capacity $C_0(E) \leq C(E)$.
- Bennett, Shor, Smolin, Thapliyal (2001): for all DMCs, $C^E(E) = C(E)$.
- Entanglement doesn't change capacity. But...
- Entanglement can increase the rate for fixed $n$ and $\epsilon$, or reduce the error probability for fixed rate and $n$: Cubitt, Leung, W.M., Winter; demonstrated experimentally by the Resch group at IQC.
- Leung, Mančinska, W.M., Ozols, Roy: example of a DMC with $C(E) = C^E_0(E) > C_0(E)$.
Motivation: Beyond Capacity

[Plot: converse and achievability bounds on the rate $\frac{1}{n} \log M_\epsilon(E^n)$ against blocklength $n$, for $\epsilon = 1/1000$ and $E$ a binary symmetric channel; a horizontal line marks the capacity, with the achievability curve below the converse curve.]

Polyanskiy, Poor, Verdú. IEEE Trans. Inf. Theory, 56 (2010).
Motivation: Beyond Capacity

- Asymptotics of the rate as a function of $n$ for DMCs:
$$\frac{1}{n} \log M_\epsilon(E^n) = C(E) - \sqrt{\frac{V(E)}{n}}\, Q^{-1}(\epsilon) + O(\log n)/n$$
- By Polyanskiy, Poor, Verdú (PPV): converse and achievability results for classical codes which match up to $O(\log n)/n$.
- Use of entanglement can't change capacity but can give striking qualitative changes to error behaviour below capacity.
- Goal: looking beyond zero-error, quantify the extent to which entanglement improves coding. E.g. what does the asymptotic expansion of the rate as a function of $n$ look like after the capacity term with entanglement assistance?
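For a concrete feel for this expansion, the following sketch evaluates the normal approximation $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$ for a binary symmetric channel (assuming scipy; the flip probability 0.11 and $\epsilon = 10^{-3}$ are illustrative values in the regime of the earlier plot):

```python
import numpy as np
from scipy.stats import norm

def bsc_capacity_dispersion(delta):
    """Capacity C and dispersion V of a BSC(delta), in bits."""
    h = -delta * np.log2(delta) - (1 - delta) * np.log2(1 - delta)
    C = 1 - h
    # Variance of the information density under the uniform
    # (capacity-achieving) input distribution.
    V = delta * (1 - delta) * np.log2((1 - delta) / delta) ** 2
    return C, V

delta, eps = 0.11, 1e-3
C, V = bsc_capacity_dispersion(delta)
for n in [100, 500, 1000, 5000]:
    rate = C - np.sqrt(V / n) * norm.isf(eps)   # Q^{-1}(eps) = norm.isf(eps)
    print(n, round(rate, 4))
```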
PPV converse

$Q \to F \to X \to E \to Y \to G \to \hat{Q}$

- $P_{XY}$: $\Pr(X = x) = \frac{1}{M} \sum_q F(x|q) =: p_x$, $\Pr(Y = y \mid X = x) = E(y|x)$.
- $P_X \times R_Y$: $\Pr(X = x) = p_x$, $\Pr(Y = y \mid X = x) = r_y$.
- Test $T$ for $P_{XY}$: specifies $\Pr(\text{pass } T \mid X = x, Y = y)$.
- $\beta_\epsilon(P_{XY}, P_X \times R_Y)$: minimum of $\Pr(\text{pass } T \mid P_X \times R_Y)$ over all tests $T$ with $\Pr(\text{pass } T \mid P_{XY}) \geq 1 - \epsilon$.
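$\beta_\epsilon$ is itself a small linear program in the test variables $T_{xy} \in [0, 1]$ (a relaxed Neyman-Pearson problem). A minimal sketch using scipy's linprog, under the definitions above:

```python
import numpy as np
from scipy.optimize import linprog

def beta_eps(P, Q, eps):
    """min sum(T*Q)  s.t.  sum(T*P) >= 1 - eps,  0 <= T <= 1.
    P, Q: joint distributions as flat numpy arrays of equal length."""
    res = linprog(c=Q,                          # minimise pass prob. under Q
                  A_ub=[-P], b_ub=[-(1 - eps)],  # pass prob. under P >= 1-eps
                  bounds=[(0, 1)] * len(P),
                  method='highs')
    return res.fun

# Example: P_XY from a BSC(0.1) with uniform input, vs. P_X x R_Y.
E = np.array([[0.9, 0.1], [0.1, 0.9]])
p = np.array([0.5, 0.5])
P_xy = (p[:, None] * E).ravel()                # Pr(x, y) = p_x E(y|x)
r = P_xy.reshape(2, 2).sum(axis=0)             # R_Y = output marginal
Q_xy = (p[:, None] * r[None, :]).ravel()       # p_x r_y
print(beta_eps(P_xy, Q_xy, eps=0.05))
```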
PPV converse

Given an $(M, \epsilon)$ code for $E$ with input distribution $P_X$, construct a test $T$ for $P_{XY}$ (vs. $P_X \times R_Y$):
$$\Pr(\text{pass } T \mid X = x, Y = y) = T_{xy} := \sum_{q \in [M]} G(q|y) \Pr(Q = q \mid X = x, F).$$
PPV converse

For $P_{XY}$: $\Pr(X = x, Y = y) = p_x E(y|x)$, and the probability of passing the test is
$$\sum_{x,y} T_{xy}\, p_x E(y|x) = \sum_{x,y,q} G(q|y) E(y|x)\, p_x \Pr(Q = q \mid X = x, F) = \frac{1}{M} \sum_{x,y,q} G(q|y) E(y|x) F(x|q) \geq 1 - \epsilon,$$
using $p_x \Pr(Q = q \mid X = x, F) = \Pr(X = x, Q = q) = \frac{1}{M} F(x|q)$.
PPV converse

For $P_X \times R_Y$: $\Pr(X = x, Y = y) = p_x r_y$, and the probability of passing the test is
$$\sum_{x,y} T_{xy}\, p_x r_y = \sum_{x,y,q} G(q|y)\, r_y\, p_x \Pr(Q = q \mid X = x, F) = \frac{1}{M} \sum_{x,y,q} G(q|y)\, r_y\, F(x|q) = \frac{1}{M} \sum_{y,q} G(q|y)\, r_y = \frac{1}{M}.$$
PPV converse

- If there exists a code of size $M$ with distribution $P_X$ for the channel input $X$ and error probability $\epsilon$, then for all distributions $R_Y$ on $Y$: $\beta_\epsilon(P_{XY}, P_X \times R_Y) \leq 1/M$.
- Therefore, for all codes of size $M$ and error probability $\epsilon$:
$$\min_{P_X} \max_{R_Y} \beta_\epsilon(P_{XY}, P_X \times R_Y) \leq \frac{1}{M}.$$
- PPV converse:
$$M_\epsilon(E) \leq \overline{M}_\epsilon(E) := \max_{P_X} \min_{R_Y} \frac{1}{\beta_\epsilon(P_{XY}, P_X \times R_Y)}.$$
Non-signalling assisted codes: minimum error LP

$$\epsilon_{\mathrm{NS}}(M, E) := 1 - \max \frac{1}{M} \sum_{q \in [M]} \sum_{x \in A} \sum_{y \in B} Z(x, q \mid q, y)\, E(y|x)$$
subject to
$$\forall q \in [M], y \in B: \sum_{x, \hat{q}} Z(x, \hat{q} \mid q, y) = 1, \qquad Z \geq 0$$
$$\forall \hat{q}, y, q, q': \sum_{x \in A} Z(x, \hat{q} \mid q, y) = \sum_{x \in A} Z(x, \hat{q} \mid q', y)$$
$$\forall x, q, y, y': \sum_{\hat{q} \in [M]} Z(x, \hat{q} \mid q, y) = \sum_{\hat{q} \in [M]} Z(x, \hat{q} \mid q, y')$$
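This is a linear program in the variables $Z(x, \hat{q} \mid q, y)$, with $|A| M^2 |B|$ of them, so a direct implementation is only practical for tiny instances. A brute-force sketch (assuming scipy; the helper name eps_ns is ours):

```python
import numpy as np
from scipy.optimize import linprog

def eps_ns(M, E):
    """Minimum error of an M-message non-signalling-assisted code for one
    use of channel E[x, y] = E(y|x), by direct LP over Z[x, qhat, q, y]."""
    A, B = E.shape
    N = A * M * M * B
    idx = np.arange(N).reshape(A, M, M, B)
    rows, rhs = [], []

    def eq(row, value):
        rows.append(row); rhs.append(value)

    # Normalisation: sum_{x, qhat} Z(x, qhat | q, y) = 1 for all q, y.
    for q in range(M):
        for y in range(B):
            row = np.zeros(N); row[idx[:, :, q, y].ravel()] = 1; eq(row, 1)
    # NS: Bob's marginal independent of q (compare each q > 0 with q = 0).
    for qh in range(M):
        for y in range(B):
            for q in range(1, M):
                row = np.zeros(N)
                row[idx[:, qh, q, y]] += 1; row[idx[:, qh, 0, y]] -= 1
                eq(row, 0)
    # NS: Alice's marginal independent of y (compare each y > 0 with y = 0).
    for x in range(A):
        for q in range(M):
            for y in range(1, B):
                row = np.zeros(N)
                row[idx[x, :, q, y]] += 1; row[idx[x, :, q, 0]] -= 1
                eq(row, 0)

    # Objective: maximise (1/M) sum_{q,x,y} Z(x, q | q, y) E(y|x).
    c = np.zeros(N)
    for q in range(M):
        c[idx[:, q, q, :].ravel()] -= (E / M).ravel()
    res = linprog(c, A_eq=np.array(rows), b_eq=rhs,
                  bounds=[(0, None)] * N, method='highs')
    return 1 + res.fun   # 1 - max success probability

E = np.array([[0.9, 0.1], [0.1, 0.9]])
print(eps_ns(2, E))      # 0.1 for this BSC
```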
Simplified LP

- Let $\pi(q)$ denote the image of $q \in [M]$ under a permutation $\pi \in S_M$, and define
$$\bar{Z}(x, \hat{q} \mid q, y) = \frac{1}{M!} \sum_{\pi \in S_M} Z(x, \pi(\hat{q}) \mid \pi(q), y).$$
- $\bar{Z}$ is still non-signalling, yields the same error probability, and is symmetric under simultaneous permutation of $q$ and $\hat{q}$.
- This symmetry is equivalent to $\bar{Z}$ having the form
$$\bar{Z}(x, \hat{q} \mid q, y) = \begin{cases} D_{xy} & \text{if } \hat{q} = q \\ W_{xy} & \text{if } \hat{q} \neq q. \end{cases}$$
- Substitute this form into the LP for the error...
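A quick numerical check of the symmetrisation step (a sketch; it reuses the Z[x, qhat, q, y] array convention of the previous sketch, and the random $Z$ below is merely a normalised conditional distribution, not necessarily non-signalling, since the $D/W$ form is a consequence of the symmetrisation alone):

```python
import math
from itertools import permutations
import numpy as np

def symmetrise(Z):
    """Average Z[x, qhat, q, y] over simultaneous permutations of (qhat, q)."""
    M = Z.shape[1]
    Zbar = np.zeros_like(Z)
    for pi in permutations(range(M)):
        pi = list(pi)
        Zbar += Z[:, pi][:, :, pi]    # Z(x, pi(qhat) | pi(q), y)
    return Zbar / math.factorial(M)

def has_dw_form(Zbar, tol=1e-9):
    """Check Zbar(x, qhat | q, y) = D_xy if qhat == q, else W_xy."""
    M = Zbar.shape[1]
    D, W = Zbar[:, 0, 0, :], Zbar[:, 1, 0, :]
    return all(np.allclose(Zbar[:, qh, q, :], D if qh == q else W, atol=tol)
               for qh in range(M) for q in range(M))

rng = np.random.default_rng(0)
Z = rng.random((2, 3, 3, 2))
Z /= Z.sum(axis=(0, 1), keepdims=True)   # normalise over (x, qhat) per (q, y)
print(has_dw_form(Z), has_dw_form(symmetrise(Z)))   # False True
```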
Simplified LP

Substituting, the objective becomes $\sum_{x \in A} \sum_{y \in B} D_{xy} E(y|x)$. The normalisation constraint becomes $\sum_{x \in A} [D_{xy} + (M-1) W_{xy}] = 1$ for every $y$, with $D, W \geq 0$; the first non-signalling constraint becomes $\sum_{x \in A} D_{xy} = \sum_{x \in A} W_{xy}$ for every $y$; and the second becomes $D_{xy} + (M-1) W_{xy} = p_x$ for every $x, y$, for some distribution $p$. Eliminating $W$ (note $W_{xy} \geq 0$ iff $D_{xy} \leq p_x$) gives
$$\epsilon_{\mathrm{NS}}(M, E) := 1 - \max \sum_{x \in A} \sum_{y \in B} D_{xy} E(y|x)$$
subject to
$$\forall y \in B: \sum_{x \in A} D_{xy} = 1/M \ (=: \gamma), \qquad \sum_{x \in A} p_x = 1,$$
$$\forall x \in A, y \in B: D_{xy} \leq p_x, \qquad D_{xy} \geq 0.$$
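The simplified LP has only $|A||B| + |A|$ variables. A minimal sketch of its evaluation (a direct transcription of the program above into scipy's linprog):

```python
import numpy as np
from scipy.optimize import linprog

def eps_ns_simplified(M, E):
    """eps_NS(M, E) via the simplified LP: variables D[x, y], then p[x]."""
    A, B = E.shape
    nD = A * B
    c = np.zeros(nD + A)
    c[:nD] = -E.ravel()                 # maximise sum_{x,y} D_xy E(y|x)

    A_eq, b_eq = [], []
    for y in range(B):                  # sum_x D_xy = 1/M for every y
        row = np.zeros(nD + A)
        row[np.arange(A) * B + y] = 1
        A_eq.append(row); b_eq.append(1 / M)
    row = np.zeros(nD + A); row[nD:] = 1
    A_eq.append(row); b_eq.append(1)    # sum_x p_x = 1

    A_ub, b_ub = [], []
    for x in range(A):                  # D_xy <= p_x for all x, y
        for y in range(B):
            row = np.zeros(nD + A)
            row[x * B + y] = 1; row[nD + x] = -1
            A_ub.append(row); b_ub.append(0)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (nD + A), method='highs')
    return 1 + res.fun

E = np.array([[0.9, 0.1], [0.1, 0.9]])
print(eps_ns_simplified(2, E))   # matches the full LP: 0.1 for this BSC
```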
LP for $M^{\mathrm{NS}}_\epsilon$

Can rewrite to find the smallest value of $1/M$ for a given $\epsilon$:
$$M^{\mathrm{NS}}_\epsilon(E)^{-1} = \min \gamma \quad \text{subject to}$$
$$\forall x \in A, y \in B: D_{xy} \leq p_x, \qquad \forall y \in B: \sum_{x \in A} D_{xy} \leq \gamma,$$
$$\sum_{x \in A} \sum_{y \in B} E(y|x) D_{xy} \geq 1 - \epsilon, \qquad \sum_{x \in A} p_x = 1,$$
$$\forall x \in A, y \in B: D_{xy} \geq 0, \quad p_x \geq 0.$$

This also gives a converse for classical codes: $M_\epsilon(E) \leq M^{\mathrm{NS}}_\epsilon(E)$.
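A sketch of this second form (assuming scipy; variables are ordered as $D$, then $p$, then $\gamma$, and the returned $M^{\mathrm{NS}}_\epsilon$ is the LP value $1/\gamma^*$, in general not an integer):

```python
import numpy as np
from scipy.optimize import linprog

def M_ns(E, eps):
    """Largest non-signalling-assisted code size M_eps^NS(E) = 1/gamma*."""
    A, B = E.shape
    nD = A * B
    n = nD + A + 1                      # D (A*B), p (A), gamma (1)
    c = np.zeros(n); c[-1] = 1          # minimise gamma

    A_ub, b_ub = [], []
    for x in range(A):                  # D_xy - p_x <= 0
        for y in range(B):
            row = np.zeros(n); row[x * B + y] = 1; row[nD + x] = -1
            A_ub.append(row); b_ub.append(0)
    for y in range(B):                  # sum_x D_xy - gamma <= 0
        row = np.zeros(n); row[np.arange(A) * B + y] = 1; row[-1] = -1
        A_ub.append(row); b_ub.append(0)
    row = np.zeros(n); row[:nD] = -E.ravel()
    A_ub.append(row); b_ub.append(-(1 - eps))   # sum E(y|x) D_xy >= 1 - eps

    A_eq = np.zeros((1, n)); A_eq[0, nD:nD + A] = 1   # sum_x p_x = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1],
                  bounds=[(0, None)] * n, method='highs')
    return 1 / res.x[-1]

E = np.array([[0.9, 0.1], [0.1, 0.9]])
print(M_ns(E, eps=0.1))   # 2.0 for one use of this BSC
```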
Equivalence to PPV converse

The PPV converse quantity is
$$\overline{M}_\epsilon(E)^{-1} = \min_{p} \max_{r} \min_{T} \sum_{x \in A} \sum_{y \in B} T_{xy}\, p_x r_y$$
subject to $\sum_{x,y} E(y|x) T_{xy} p_x \geq 1 - \epsilon$, $\sum_x p_x = 1$, $\sum_y r_y = 1$, and $p_x \geq 0$, $0 \leq T_{xy} \leq 1$, $r_y \geq 0$ for all $x \in A$, $y \in B$.

- By von Neumann's minimax theorem, $\max_r$ and $\min_T$ can be exchanged: $\min_p \min_T \max_r$.
- The objective is linear in $r$, so the maximum over the simplex of distributions $r$ is attained at a point mass:
$$\overline{M}_\epsilon(E)^{-1} = \min_{p, T} \max_{y \in B} \sum_{x \in A} T_{xy}\, p_x.$$
- Substitute $D_{xy} := T_{xy} p_x$; the constraint $0 \leq T_{xy} \leq 1$ becomes $0 \leq D_{xy} \leq p_x$.
- Writing the inner maximum as the smallest $\gamma$ with $\gamma \geq \sum_{x \in A} D_{xy}$ for all $y \in B$:
$$\overline{M}_\epsilon(E)^{-1} = \min_{D, p} \gamma \quad \text{s.t.} \quad \sum_{x,y} E(y|x) D_{xy} \geq 1 - \epsilon, \quad \sum_x p_x = 1, \quad p_x \geq 0, \quad 0 \leq D_{xy} \leq p_x, \quad \forall y: \gamma \geq \sum_x D_{xy}.$$
- This is exactly the LP for $M^{\mathrm{NS}}_\epsilon(E)^{-1}$, so
$$\overline{M}_\epsilon(E) = M^{\mathrm{NS}}_\epsilon(E).$$
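The equality can be spot-checked numerically. For a symmetric channel such as the BSC it is natural to take $P_X$ and $R_Y$ uniform (an assumption justified here by the channel's symmetry, not by the general argument), so the PPV quantity reduces to a single $\beta_\epsilon$ evaluation; reusing beta_eps and M_ns from the sketches above:

```python
import numpy as np

E = np.array([[0.9, 0.1], [0.1, 0.9]])   # BSC(0.1)
p = np.array([0.5, 0.5])                 # uniform input (optimal by symmetry)
P_xy = (p[:, None] * E).ravel()          # Pr(x, y) = p_x E(y|x)
Q_xy = np.full(4, 0.25)                  # p_x r_y with r uniform
eps = 0.1

print(1 / beta_eps(P_xy, Q_xy, eps))     # hypothesis-testing bound: 2.0
print(M_ns(E, eps))                      # non-signalling LP: the same value
```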
Consequences

All converse results which can be derived from the PPV converse apply to non-signalling and entanglement-assisted codes too (for discrete channels):
- Non-signalling assistance doesn't change capacity, for completely general channels (using a result of Verdú and Han).
- The asymptotic expansion of the rate is unchanged by NS/entanglement assistance up to $O(\log n)/n$.

Linear program formulation of the PPV converse:
- The dual formulation of the LP gives a converse bound for any feasible point.
- For channels with permutation covariance (e.g. DMCs), the LP can be simplified to size $\mathrm{poly}(n)$.
Dispersionless channels

- Information density: $i(a; b) = \log \frac{\Pr(X = a, Y = b)}{\Pr(X = a) \Pr(Y = b)}$.
- The capacity $C(E)$ is the maximum of the expectation of the information density over input distributions.
- Dispersion $V(E)$: the minimum variance of the information density over capacity-achieving input distributions.
$$\frac{1}{n} \log M_\epsilon(E^n) = C(E) - \sqrt{\frac{V(E)}{n}}\, Q^{-1}(\epsilon) + O(\log n)/n$$
- For DMCs: $C^{\mathrm{NS}}_0(E) = C(E) \iff V(E) = 0$.
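A sketch computing the mean and variance of the information density for a fixed channel and input distribution (capacity and dispersion are obtained by optimising these over the input distribution; here the uniform input is simply a convenient example, which happens to be optimal for the BSC):

```python
import numpy as np

def info_density_moments(p, E):
    """Mean and variance of i(X; Y) = log2( E(y|x) / Pr(Y=y) )
    under the joint distribution Pr(x, y) = p_x E(y|x)."""
    P_xy = p[:, None] * E                 # joint distribution
    P_y = P_xy.sum(axis=0)                # output marginal
    with np.errstate(divide='ignore'):
        i = np.log2(E / P_y[None, :])     # information density per (x, y)
    mask = P_xy > 0                       # ignore zero-probability pairs
    mean = (P_xy[mask] * i[mask]).sum()               # = I(X; Y)
    var = (P_xy[mask] * (i[mask] - mean) ** 2).sum()
    return mean, var

E = np.array([[0.9, 0.1], [0.1, 0.9]])
p = np.array([0.5, 0.5])
C, V = info_density_moments(p, E)
print(C, V)   # C = 1 - h(0.1) ~ 0.531, V = 0.1*0.9*log2(9)^2 ~ 0.904
```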
Future work

- Use the dual LP formulation of the converse to improve efficiently computable finite-blocklength bounds for specific channels.
- Improved bounds for small blocklengths by augmenting the LP with certain families of Bell inequalities?
- The non-signalling converse generalises naturally to multi-terminal coding; investigate applications.