Phase Transitions of Random Codes and GV-Bounds


1 Phase Transitions of Random Codes and GV-Bounds
Yun Fan, Math Dept, CCNU
A joint work with Ling, Liu, Xing
Oct 2011

2 Outline
1 Phase Transitions
2 Shannon's World
3 Hamming's World
4 Gilbert-Varshamov Bound
5 Phase Transitions of Random Codes
6 Pictures for Phase Transitions of Random Codes

3-4 Phase Transitions: in Physics
SOLID <-- melting / freezing --> LIQUID <-- vaporization / condensation --> GAS
(for water: melting at 0 °C, vaporization at 100 °C)

5-7 Phase Transitions in Mathematics: Pioneer
Random graph G(n, p): n vertices; each pair of vertices is joined by an edge independently with probability p.

n = 2: the two possible graphs (no edge / one edge) occur with probabilities 1-p and p.

n = 3: the graphs with 0, 1, 2, 3 edges occur with probabilities (1-p)^3, 3(1-p)^2 p, 3(1-p)p^2, p^3.

8-12 Phase Transitions in Mathematics: Pioneer
Is G(n, p) connected?

DISCONNECTED <--- p decreasing --- (ln n)/n --- p increasing ---> CONNECTED

lim_{n -> oo} Pr( G(n, p) connected ) = { 0, if p < (ln n)/n;  1, if p > (ln n)/n. }

Published in: P. Erdős, A. Rényi, "On the evolution of random graphs", Publ. Math. Inst. Hungar. Acad. Sci., vol. 5, pp. 17-61, 1960.

The usage "phase transition" in mathematics appeared for the first time in: S. Janson, T. Łuczak, A. Ruciński, "The Phase Transition", Ch. 5 in Random Graphs, New York: Wiley, 2000.
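The connectivity threshold above is easy to observe numerically. A minimal sketch in plain Python (the helper names are ours, not from the talk): sample G(n, p) with a union-find structure and estimate the probability of connectivity on the two sides of (ln n)/n.

```python
import math
import random

def gnp_connected(n, p, rng):
    """Sample G(n, p) and test connectivity with union-find."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    components = n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:  # each pair gets an edge with probability p
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
                    components -= 1
    return components == 1

def connectivity_rate(n, p, trials=100, seed=0):
    rng = random.Random(seed)
    return sum(gnp_connected(n, p, rng) for _ in range(trials)) / trials

n = 100
threshold = math.log(n) / n
print(connectivity_rate(n, 2.0 * threshold))  # well above the threshold: near 1
print(connectivity_rate(n, 0.3 * threshold))  # well below the threshold: near 0
```

Already at n = 100 the two estimated probabilities sit close to the limiting values 1 and 0.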

13-15 Phase Transitions in Mathematics: Linear system
Random linear system S over a finite field F, |F| = q:

S:  a_{11} x_1 + ... + a_{1n} x_n = b_1,  ...,  a_{m1} x_1 + ... + a_{mn} x_n = b_m

lim_{n -> oo} Pr( S has solutions ) = { 1, if r < 1;  0, if r > 1, }   where r = m/n.

SOLUTION <--- r decreasing --- 1 --- r increasing ---> NO SOLUTION
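The r = m/n threshold can be checked empirically over F_2. A small sketch (helper names are ours): decide consistency of a uniformly random system A x = b by Gaussian elimination.

```python
import random

def has_solution_gf2(A, b):
    """Decide whether A x = b is consistent over F_2 (Gaussian elimination)."""
    m, n = len(A), len(A[0])
    rows = [row[:] + [rhs] for row, rhs in zip(A, b)]  # augmented matrix
    pivot = 0
    for col in range(n):
        pr = next((r for r in range(pivot, m) if rows[r][col]), None)
        if pr is None:
            continue
        rows[pivot], rows[pr] = rows[pr], rows[pivot]
        for r in range(m):
            if r != pivot and rows[r][col]:
                rows[r] = [x ^ y for x, y in zip(rows[r], rows[pivot])]
        pivot += 1
    # inconsistent iff some row reads "0 = 1" after elimination
    return all(any(row[:n]) or not row[n] for row in rows)

def solvable_rate(m, n, trials=200, seed=1):
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        A = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]
        b = [rng.randrange(2) for _ in range(m)]
        ok += has_solution_gf2(A, b)
    return ok / trials

print(solvable_rate(20, 40))  # r = 0.5 < 1: almost always solvable
print(solvable_rate(40, 20))  # r = 2 > 1: almost never solvable
```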

16-19 Phase Transitions in Mathematics: Linear system — A proof
A = (a_{ij})_{m x n} = (A_1, ..., A_n),  b = (b_1, ..., b_m)^T.

Case r < 1:
Pr( rank A < m ) <= sum_{W <= F^m, dim W = m-1} Pr( all A_j contained in W ) <= q^m (1/q)^n = q^{(r-1)n} -> 0,
hence lim_{n -> oo} Pr( solution ) >= lim_{n -> oo} Pr( rank A = m ) = 1.

Case r > 1:
Pr( solution ) = Pr( b in Span(A_1, ..., A_n) ) <= q^n / q^m = q^{(1-r)n} -> 0.
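The first step of the proof bounds Pr(rank A < m). For a uniform random m x n matrix over F_q this probability is in fact known exactly, since the number of rank-m matrices is prod_{i=0}^{m-1} (q^n - q^i). A short sketch (function name is ours) evaluating the exact full-rank probability:

```python
def full_rank_prob(q, m, n):
    """Exact Pr(rank A = m) for a uniform random m x n matrix over F_q:
    prod_{i=0}^{m-1} (1 - q^(i - n)); zero when m > n."""
    if m > n:
        return 0.0
    p = 1.0
    for i in range(m):
        p *= 1.0 - float(q) ** (i - n)
    return p

print(full_rank_prob(2, 50, 100))   # r = 1/2: extremely close to 1
print(full_rank_prob(2, 100, 100))  # square case: tends to ~0.2888, not to 1
```

The square case shows why the slide needs r strictly below 1: at r = 1 the full-rank probability converges to the constant prod_{j>=1}(1 - 2^{-j}) ~ 0.2888 rather than to 1.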

20-22 Shannon's World
F: alphabet, cardinality |F| = q. Message word w in F^k.

encoding E: F^k -> F^n sends w to E(w); the CHANNEL adds noise e and delivers E(w) + e; decoding D: F^n -> F^k returns D(E(w) + e).

Does D(E(w) + e) = w?

23-25 Shannon's World: Binary symmetric channels
Each transmitted bit is received correctly with probability 1-p and flipped with probability p (p = false probability, 1-p = true probability).

Transition probability matrix:
( 1-p   p  )
(  p   1-p )

Entropy: H(p) = -p log_2 p - (1-p) log_2 (1-p)

26-27 Shannon's World: q-ary symmetric channels
Each transmitted symbol is received correctly with probability 1-p, and is changed to each of the other q-1 symbols with probability p/(q-1) (p = false probability, 1-p = true probability).

Transition probability matrix: the q x q matrix with 1-p on the diagonal and p/(q-1) everywhere off the diagonal.

q-ary entropy: H_q(p) = p log_q (q-1) - p log_q p - (1-p) log_q (1-p)
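The q-ary entropy, and the channel capacity c = 1 - H_q(p) that appears below, are straightforward to evaluate. A minimal sketch (function names are ours):

```python
import math

def h_q(p, q=2):
    """q-ary entropy: H_q(p) = p*log_q(q-1) - p*log_q(p) - (1-p)*log_q(1-p)."""
    if p == 0.0:
        return 0.0
    if p == 1.0:
        return math.log(q - 1, q) if q > 2 else 0.0
    return (p * math.log(q - 1, q)
            - p * math.log(p, q)
            - (1 - p) * math.log(1 - p, q))

def capacity(p, q=2):
    """Capacity of the q-ary symmetric channel with error probability p."""
    return 1.0 - h_q(p, q)

print(h_q(0.5, 2))        # ~1.0: a fair binary coin has full entropy
print(h_q(0.75, 4))       # ~1.0: H_q is maximal at p = (q-1)/q
print(capacity(0.11, 2))  # about 0.5
```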

28-29 Shannon's World: Shannon's Theorem
Coding device = E: F^k -> F^n together with D: F^n -> F^k; rate r = k/n.

Shannon's Theorem.
If r < 1 - H_q(p), then there exists a coding device of rate r such that lim_{n -> oo} Pr( D(E(w) + e) = w ) = 1.
If r > 1 - H_q(p), then for any coding device of rate r, lim_{n -> oo} Pr( D(E(w) + e) = w ) = 0.

30-31 Shannon's World: Shannon's Theorem
c = 1 - H_q(p): the capacity of the q-ary symmetric channel.

HIGH FIDELITY <--- r decreasing --- c --- r increasing ---> NO FIDELITY

32-33 Shannon's World: Shannon's Theorem
C. E. Shannon, "A mathematical theory of communication", Bell Sys. Tech. Journal, vol. 27, pp. 379-423, 623-655, 1948.
If r < c, lim_{n -> oo} Pr( D(E(w) + e) = w ) = 1.
If r > c, there is a b < 1 such that lim_{n -> oo} Pr( D(E(w) + e) = w ) < b.

Jacob Wolfowitz, "The coding of messages subject to chance errors", Illinois J. Math., vol. 1, 1957.
If r > c, lim_{n -> oo} Pr( D(E(w) + e) = w ) = 0.

34-35 Hamming's World
Hamming distance: d_H(x, x') = |{ 1 <= i <= n : x_i != x'_i }|, for x, x' in F^n.

Codes: C a subset of F^n
  rate: R(C) = log_q(|C|)/n, i.e. |C| = q^{R(C) n}
  minimal distance: d_H(C) = min_{c != c' in C} d_H(c, c')
  relative distance: delta_H(C) = d_H(C)/n

Linear codes: F a finite field and C a subspace of F^n
  weight: w_H(x) = |{ 1 <= i <= n : x_i != 0 }|
  minimal weight: w_H(C) = min_{0 != c in C} w_H(c)
  d_H(C) = w_H(C)
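These definitions translate directly into code. A minimal sketch (names are ours) computing the rate, minimal distance and relative distance of a small code:

```python
import math
from itertools import combinations

def d_H(x, y):
    """Hamming distance: number of coordinates where x and y differ."""
    return sum(a != b for a, b in zip(x, y))

def rate(code, q):
    """R(C) = log_q(|C|) / n."""
    n = len(next(iter(code)))
    return math.log(len(code), q) / n

def min_distance(code):
    """d_H(C) = minimum over distinct pairs of codewords."""
    return min(d_H(c1, c2) for c1, c2 in combinations(code, 2))

# the binary repetition code of length 3
C = ["000", "111"]
print(rate(C, 2))           # 1/3
print(min_distance(C))      # 3
print(min_distance(C) / 3)  # relative distance 1.0
```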

36-37 Hamming's World: Good codes
Good C: R(C) is large, and delta_H(C) is large.
Trade-off between rate and relative distance?

38-39 Hamming's World: Upper bounds
Let delta_0 = 1 - 1/q. Given delta_H(C) = delta, 0 < delta < delta_0.

Many functions bound r = R(C) from above:
  Singleton bound: r <= 1 - delta
  Plotkin bound: r <= 1 - delta/delta_0
  Hamming bound: r <= 1 - H_q(delta/2)
  Elias bound: r <= 1 - H_q( delta_0 - sqrt(delta_0 (delta_0 - delta)) )

40 Hamming's World: Lower bounds
When we are given delta_H(C) = delta, to explore good codes we are in fact concerned with how large r = R(C) can reach.

41 Gilbert-Varshamov Bound: Function GV_q(delta)
GV_q(delta) = 1 - H_q(delta),  delta in (0, delta_0)
Figure: the curve GV_q(delta) decreases from 1 at delta = 0 to 0 at delta = delta_0.
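A quick numerical sketch of GV_q(delta) = 1 - H_q(delta) (function name is ours) confirms the shape of the plot: the curve falls from 1 near delta = 0 towards 0 at delta_0 = 1 - 1/q.

```python
import math

def gv_q(delta, q=2):
    """GV_q(delta) = 1 - H_q(delta) for 0 < delta < delta_0 = 1 - 1/q."""
    h = (delta * math.log(q - 1, q)
         - delta * math.log(delta, q)
         - (1 - delta) * math.log(1 - delta, q))
    return 1.0 - h

for d in (0.01, 0.1, 0.3, 0.49):
    print(round(gv_q(d, 2), 4))  # strictly decreasing towards 0 as d -> 1/2
```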

42 Gilbert-Varshamov Bound: GV-bound
Asymptotic Gilbert-Varshamov Bound. For any r < GV_q(delta), there exist codes C over F (of large enough length) with rate r and relative distance > delta.

43-48 GV-bound: Gilbert's argument, greedy algorithm
d = delta n, k = r n; M = q^k.
B(x, d) = the Hamming ball of radius d centred at x; V_q(n, d) = |B(x, d)|.

Take c_1 in F^n; take c_2 in F^n \ B(c_1, d);
once F^n != U_{i=1}^{m} B(c_i, d), take c_{m+1} in F^n \ U_{i=1}^{m} B(c_i, d);
continue up to U_{i=1}^{M} B(c_i, d) = F^n.

Then  M V_q(n, d) = sum_{i=1}^{M} V_q(n, d) >= | U_{i=1}^{M} B(c_i, d) | = q^n,

so  r = log_q(M)/n >= 1 - log_q(V_q(n, d))/n >= 1 - H_q(delta) = GV_q(delta).
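The greedy procedure is directly implementable for small parameters. A sketch over F_2 (helper names are ours): scan F^n, keep each word whose distance to all kept words exceeds d, and check the resulting size against the guarantee q^n / V_q(n, d).

```python
import math
from itertools import combinations, product

def d_H(x, y):
    return sum(a != b for a, b in zip(x, y))

def ball_volume(q, n, d):
    """V_q(n, d) = sum_{i=0}^{d} C(n, i) (q-1)^i."""
    return sum(math.comb(n, i) * (q - 1) ** i for i in range(d + 1))

def gilbert_greedy(q, n, d):
    """Keep every word at distance > d from all previously kept words."""
    code = []
    for w in product(range(q), repeat=n):
        if all(d_H(w, c) > d for c in code):
            code.append(w)
    return code

q, n, d = 2, 7, 2
C = gilbert_greedy(q, n, d)
print(min(d_H(a, b) for a, b in combinations(C, 2)))      # > d by construction
print(len(C), math.ceil(q ** n / ball_volume(q, n, d)))   # size meets the GV guarantee
```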

49-53 Varshamov's argument, probabilistic method
Varshamov. Select a linear code L of rate r < GV_q(delta) uniformly at random from F^n, where F is a finite field; then lim_{n -> oo} Pr( delta_H(L) > delta ) = 1.

Random linear map g: F^k -> F^n, i.e. a random matrix G = (g_{ij})_{k x n}.

For any 0 != b in F^k,  Pr( w_H(g(b)) <= d ) = V_q(n, d)/q^n <= q^{-GV_q(delta) n}.

Pr( delta_H(L) <= delta ) <= sum_{0 != b in F^k} Pr( w_H(g(b)) <= d )
                          <= q^k V_q(n, d)/q^n <= q^{(r - GV_q(delta)) n} -> 0.
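Varshamov's statement is easy to test numerically. A sketch for q = 2 (names are ours): draw a uniform random k x n generator matrix with k/n safely below GV_2(delta), and check that the minimal weight of the spanned code usually exceeds delta n.

```python
import random

def random_linear_min_weight(n, k, rng):
    """Min weight over the nonzero codewords spanned by a random
    k x n binary generator matrix (rows stored as n-bit integers)."""
    rows = [rng.getrandbits(n) for _ in range(k)]
    best = n
    for mask in range(1, 1 << k):          # all nonzero messages b
        cw = 0
        for i in range(k):
            if (mask >> i) & 1:
                cw ^= rows[i]
        best = min(best, bin(cw).count("1"))
    return best

n, k, delta = 50, 7, 0.2   # rate 0.14 < GV_2(0.2) ~ 0.28
rng = random.Random(0)
trials = 40
good = sum(random_linear_min_weight(n, k, rng) > delta * n for _ in range(trials))
print(good / trials)  # close to 1
```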

54-57 GV-Bound: Comparison with Shannon's world
The q-ary GV-bound coincides exactly with the Shannon capacity of q-ary symmetric channels.
Varshamov's argument deals with random objects by means of probabilistic methods.
What happens if r is beyond the GV-bound?

58-61 GV-Bound: Beyond GV-bound
Codes of relative distance > delta and rate r beyond the GV-bound?

Famous work: yes if q >= 49 — algebraic geometry codes.
M. A. Tsfasman, S. G. Vladut, T. Zink, "Modular curves, Shimura curves and Goppa codes, better than Varshamov-Gilbert bound", Math. Nachrichten, vol. 104, pp. 13-28, 1982.

For q = 2, no code C with delta_H(C) > delta and rate R(C) > GV_q(delta) has been reported.
A guess: no code of rel. dist. > delta and rate > GV_q(delta) for q = 2.

62-64 GV-Bound: Arbitrary random codes
F: an alphabet; C: a random code of rate r in F^n (its codewords selected uniformly at random from F^n).

lim_{n -> oo} Pr( delta_H(C) > delta ) = 1,  if r < (1/2) GV_q(delta).

It follows from: Alexander Barg, G. David Forney, "Random codes: Minimum distances and error exponents", IEEE Trans. Inform. Theory, vol. 48, 2002.

65-66 Phase Transitions of Random Codes: What we do?
F: a finite field; L: a random linear code of rate r in F^n:
lim_{n -> oo} Pr( delta_H(L) > delta ) = { 1, if r < GV_q(delta);  ??, if r > GV_q(delta). }

F: an alphabet; C: a random code of rate r in F^n:
lim_{n -> oo} Pr( delta_H(C) > delta ) = { 1, if r < (1/2) GV_q(delta);  ??, if r > (1/2) GV_q(delta). }

67-68 Phase Transitions of Random Codes: Linear case
F: a finite field; L: a random linear code of rate r in F^n.

Our Result. lim_{n -> oo} Pr( delta_H(L) > delta ) = { 1, if r < GV_q(delta);  0, if r > GV_q(delta). }

Is delta_H(L) > delta?
YES <--- r decreasing --- GV_q(delta) --- r increasing ---> NO

69-74 Linear case: Sketch of proof
Random linear map g: F^k -> F^n.

Random variables, for b in F^k:  X_b = 1 if w_H(g(b)) <= delta n, and X_b = 0 otherwise.

Random variable X = sum_{0 != b in F^k} X_b.

Expectation E(X_b) = V_q(n, delta n)/q^n = q^{-GV_q(delta) n + o(n)}.

Expectation E(X) = (q^k - 1) E(X_b) -> { 0, if r < GV_q(delta);  oo, if r > GV_q(delta). }
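The dichotomy for E(X) can be seen numerically. A sketch (names are ours) evaluating E(X) = (q^k - 1) V_q(n, floor(delta n))/q^n for rates on the two sides of GV_2(0.2) ~ 0.278:

```python
import math

def ball_volume(q, n, d):
    """V_q(n, d) = sum_{i=0}^{d} C(n, i) (q-1)^i."""
    return sum(math.comb(n, i) * (q - 1) ** i for i in range(d + 1))

def expected_X(q, n, r, delta):
    """E(X) = (q^k - 1) * V_q(n, floor(delta*n)) / q^n with k = floor(r*n)."""
    k, d = int(r * n), int(delta * n)
    return (q ** k - 1) * ball_volume(q, n, d) / q ** n

for n in (100, 200, 400):
    print(expected_X(2, n, 0.10, 0.2))  # r < GV: shrinks towards 0
for n in (100, 200, 400):
    print(expected_X(2, n, 0.40, 0.2))  # r > GV: blows up
```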

75-77 Linear case: Sketch of proof: Case r < GV_q(delta)
By Markov's inequality, lim_{n -> oo} Pr( X >= 1 ) <= lim_{n -> oo} E(X) = 0,

so lim_{n -> oo} Pr( delta_H(L) > delta ) = lim_{n -> oo} Pr( X = 0 ) = 1.

78-81 Linear case: Sketch of proof: Case r > GV_q(delta)
Second moment method:
Pr( X >= 1 ) >= sum_{0 != b in F^k} E(X_b) / E( X | X_b = 1 ).

We can compute that E( X | X_b = 1 ) = (q^k - q) E(X_b) + (q - 1).

From lim_{n -> oo} q^k E(X_b) = lim_{n -> oo} E(X) = oo, we have

lim_{n -> oo} Pr( X >= 1 ) >= lim_{n -> oo} q^k E(X_b) / ( (q^k - q) E(X_b) + (q - 1) ) = 1,

so lim_{n -> oo} Pr( delta_H(L) > delta ) = lim_{n -> oo} Pr( X = 0 ) = 0.

82-83 Phase Transitions of Random Codes: Arbitrary case
F: an alphabet; C: a random code of rate r in F^n.

Our Result. lim_{n -> oo} Pr( delta_H(C) > delta ) = { 1, if r < (1/2) GV_q(delta);  0, if r > (1/2) GV_q(delta). }

Is delta_H(C) > delta?
YES <--- r decreasing --- (1/2) GV_q(delta) --- r increasing ---> NO
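The half-GV threshold for unstructured random codes reflects a birthday-paradox effect: with M = q^{rn} independent codewords there are about M^2 pairs, so close pairs start appearing once 2r exceeds GV_q(delta). A small experiment for q = 2, delta = 0.2, where (1/2) GV_2(0.2) ~ 0.14 (names are ours):

```python
import random

def min_rel_distance(n, M, rng):
    """Minimal relative Hamming distance of M uniform random words in {0,1}^n."""
    words = [rng.getrandbits(n) for _ in range(M)]
    best = n
    for i in range(M):
        for j in range(i + 1, M):
            best = min(best, bin(words[i] ^ words[j]).count("1"))
    return best / n

def fraction_above(n, M, delta, trials=30, seed=2):
    rng = random.Random(seed)
    return sum(min_rel_distance(n, M, rng) > delta for _ in range(trials)) / trials

n, delta = 30, 0.2
print(fraction_above(n, 2 ** 3, delta))  # rate 0.10 < 0.14: usually above delta
print(fraction_above(n, 181, delta))     # rate ~0.25 > 0.14: almost never
```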

84 Pictures for Phase Transitions: Linear case
Linear case: random linear code L.
Figure: p = Pr( delta_H(L) > delta ), 0 < delta < delta_0, plotted against r: p stays at 1 until r reaches GV_q(delta), then drops to 0.

85 Pictures for Phase Transitions: Linear case
Figure: three areas in the (delta, r) plane for the probability of the event delta_H(L) > delta, cut by the upper-bound curves and the GV-bound: below the GV-bound the probability is large (tends to 1); between the GV-bound and the upper bound it is small (tends to 0); above the upper bound it is zero.

86 Pictures for Phase Transitions: Linear case
Figure: p = Pr( delta_H(L) > delta ) plotted over the (delta, r) plane.
r = GV_q(delta) is a phase transition curve.

87 Pictures for Phase Transitions: Arbitrary case
Arbitrary case: arbitrary random code C.
Figure: three (four) areas in the (delta, r) plane for the probability of the event delta_H(C) > delta, cut by the upper bound, the GV-bound and the half GV-bound:
I (below the half GV-bound): probability 1;
II (between the half GV-bound and the upper bound): probability 0 —
  II' (below the GV-bound): the greedy algorithm works;
  II'' (above the GV-bound): the greedy algorithm doesn't work;
III (above the upper bound): probability = 0.

88 Phase Transitions of Random Codes and GV-Bounds
THANK YOU


More information

LET be the finite field of cardinality and let

LET be the finite field of cardinality and let 128 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 43, NO. 1, JANUARY 1997 An Explicit Construction of a Sequence of Codes Attaining the Tsfasman Vlăduţ Zink Bound The First Steps Conny Voss Tom Høholdt,

More information

Matrix characterization of linear codes with arbitrary Hamming weight hierarchy

Matrix characterization of linear codes with arbitrary Hamming weight hierarchy Linear Algebra and its Applications 412 (2006) 396 407 www.elsevier.com/locate/laa Matrix characterization of linear codes with arbitrary Hamming weight hierarchy G. Viswanath, B. Sundar Rajan Department

More information

Asymptotically good sequences of codes and curves

Asymptotically good sequences of codes and curves Asymptotically good sequences of codes and curves Ruud Pellikaan Technical University of Eindhoven Soria Summer School on Computational Mathematics July 9, 2008 /k 1/29 Content: 8 Some algebraic geometry

More information

Arrangements, matroids and codes

Arrangements, matroids and codes Arrangements, matroids and codes first lecture Ruud Pellikaan joint work with Relinde Jurrius ACAGM summer school Leuven Belgium, 18 July 2011 References 2/43 1. Codes, arrangements and matroids by Relinde

More information

IMPROVING THE ALPHABET-SIZE IN EXPANDER BASED CODE CONSTRUCTIONS

IMPROVING THE ALPHABET-SIZE IN EXPANDER BASED CODE CONSTRUCTIONS IMPROVING THE ALPHABET-SIZE IN EXPANDER BASED CODE CONSTRUCTIONS 1 Abstract Various code constructions use expander graphs to improve the error resilience. Often the use of expanding graphs comes at the

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

CS5314 Randomized Algorithms. Lecture 18: Probabilistic Method (De-randomization, Sample-and-Modify)

CS5314 Randomized Algorithms. Lecture 18: Probabilistic Method (De-randomization, Sample-and-Modify) CS5314 Randomized Algorithms Lecture 18: Probabilistic Method (De-randomization, Sample-and-Modify) 1 Introduce two topics: De-randomize by conditional expectation provides a deterministic way to construct

More information

Midterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016

Midterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016 Midterm Exam Time: 09:10 12:10 11/23, 2016 Name: Student ID: Policy: (Read before You Start to Work) The exam is closed book. However, you are allowed to bring TWO A4-size cheat sheet (single-sheet, two-sided).

More information

Dispersion of the Gilbert-Elliott Channel

Dispersion of the Gilbert-Elliott Channel Dispersion of the Gilbert-Elliott Channel Yury Polyanskiy Email: ypolyans@princeton.edu H. Vincent Poor Email: poor@princeton.edu Sergio Verdú Email: verdu@princeton.edu Abstract Channel dispersion plays

More information

On Dependence Balance Bounds for Two Way Channels

On Dependence Balance Bounds for Two Way Channels On Dependence Balance Bounds for Two Way Channels Ravi Tandon Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ravit@umd.edu ulukus@umd.edu

More information

ELEC 519A Selected Topics in Digital Communications: Information Theory. Hamming Codes and Bounds on Codes

ELEC 519A Selected Topics in Digital Communications: Information Theory. Hamming Codes and Bounds on Codes ELEC 519A Selected Topics in Digital Communications: Information Theory Hamming Codes and Bounds on Codes Single Error Correcting Codes 2 Hamming Codes (7,4,3) Hamming code 1 0 0 0 0 1 1 0 1 0 0 1 0 1

More information

Notes for the Hong Kong Lectures on Algorithmic Coding Theory. Luca Trevisan. January 7, 2007

Notes for the Hong Kong Lectures on Algorithmic Coding Theory. Luca Trevisan. January 7, 2007 Notes for the Hong Kong Lectures on Algorithmic Coding Theory Luca Trevisan January 7, 2007 These notes are excerpted from the paper Some Applications of Coding Theory in Computational Complexity [Tre04].

More information

Lecture 7: February 6

Lecture 7: February 6 CS271 Randomness & Computation Spring 2018 Instructor: Alistair Sinclair Lecture 7: February 6 Disclaimer: These notes have not been subjected to the usual scrutiny accorded to formal publications. They

More information

Arimoto Channel Coding Converse and Rényi Divergence

Arimoto Channel Coding Converse and Rényi Divergence Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code

More information

A Combinatorial Bound on the List Size

A Combinatorial Bound on the List Size 1 A Combinatorial Bound on the List Size Yuval Cassuto and Jehoshua Bruck California Institute of Technology Electrical Engineering Department MC 136-93 Pasadena, CA 9115, U.S.A. E-mail: {ycassuto,bruck}@paradise.caltech.edu

More information

A multiple access system for disjunctive vector channel

A multiple access system for disjunctive vector channel Thirteenth International Workshop on Algebraic and Combinatorial Coding Theory June 15-21, 2012, Pomorie, Bulgaria pp. 269 274 A multiple access system for disjunctive vector channel Dmitry Osipov, Alexey

More information

Cut-Set Bound and Dependence Balance Bound

Cut-Set Bound and Dependence Balance Bound Cut-Set Bound and Dependence Balance Bound Lei Xiao lxiao@nd.edu 1 Date: 4 October, 2006 Reading: Elements of information theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems

More information

Chapter 2 Review of Classical Information Theory

Chapter 2 Review of Classical Information Theory Chapter 2 Review of Classical Information Theory Abstract This chapter presents a review of the classical information theory which plays a crucial role in this thesis. We introduce the various types of

More information

Source Coding with Lists and Rényi Entropy or The Honey-Do Problem

Source Coding with Lists and Rényi Entropy or The Honey-Do Problem Source Coding with Lists and Rényi Entropy or The Honey-Do Problem Amos Lapidoth ETH Zurich October 8, 2013 Joint work with Christoph Bunte. A Task from your Spouse Using a fixed number of bits, your spouse

More information

Some Basic Concepts of Probability and Information Theory: Pt. 2

Some Basic Concepts of Probability and Information Theory: Pt. 2 Some Basic Concepts of Probability and Information Theory: Pt. 2 PHYS 476Q - Southern Illinois University January 22, 2018 PHYS 476Q - Southern Illinois University Some Basic Concepts of Probability and

More information

Lecture 17: Perfect Codes and Gilbert-Varshamov Bound

Lecture 17: Perfect Codes and Gilbert-Varshamov Bound Lecture 17: Perfect Codes and Gilbert-Varshamov Bound Maximality of Hamming code Lemma Let C be a code with distance 3, then: C 2n n + 1 Codes that meet this bound: Perfect codes Hamming code is a perfect

More information

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9 Katalin Marton Abbas El Gamal Stanford University Withits 2010 A. El Gamal (Stanford University) Katalin Marton Withits 2010 1 / 9 Brief Bio Born in 1941, Budapest Hungary PhD from Eötvös Loránd University

More information

Approaching Blokh-Zyablov Error Exponent with Linear-Time Encodable/Decodable Codes

Approaching Blokh-Zyablov Error Exponent with Linear-Time Encodable/Decodable Codes Approaching Blokh-Zyablov Error Exponent with Linear-Time Encodable/Decodable Codes 1 Zheng Wang, Student Member, IEEE, Jie Luo, Member, IEEE arxiv:0808.3756v1 [cs.it] 27 Aug 2008 Abstract We show that

More information

CSCI 2570 Introduction to Nanocomputing

CSCI 2570 Introduction to Nanocomputing CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication

More information

Computing and Communications 2. Information Theory -Entropy

Computing and Communications 2. Information Theory -Entropy 1896 1920 1987 2006 Computing and Communications 2. Information Theory -Entropy Ying Cui Department of Electronic Engineering Shanghai Jiao Tong University, China 2017, Autumn 1 Outline Entropy Joint entropy

More information

Information Theory. Lecture 5 Entropy rate and Markov sources STEFAN HÖST

Information Theory. Lecture 5 Entropy rate and Markov sources STEFAN HÖST Information Theory Lecture 5 Entropy rate and Markov sources STEFAN HÖST Universal Source Coding Huffman coding is optimal, what is the problem? In the previous coding schemes (Huffman and Shannon-Fano)it

More information

The Hamming Codes and Delsarte s Linear Programming Bound

The Hamming Codes and Delsarte s Linear Programming Bound The Hamming Codes and Delsarte s Linear Programming Bound by Sky McKinley Under the Astute Tutelage of Professor John S. Caughman, IV A thesis submitted in partial fulfillment of the requirements for the

More information

Lecture 4 Noisy Channel Coding

Lecture 4 Noisy Channel Coding Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem

More information

Construction of Asymptotically Good Low-Rate Error-Correcting Codes through Pseudo-Random Graphs

Construction of Asymptotically Good Low-Rate Error-Correcting Codes through Pseudo-Random Graphs Construction of Asymptotically Good Low-Rate Error-Correcting Codes through Pseudo-Random Graphs Noga Alon Jehoshua Bruck Joseph Naor Moni Naor Ron M. Roth Abstract A new technique, based on the pseudo-random

More information

5. Density evolution. Density evolution 5-1

5. Density evolution. Density evolution 5-1 5. Density evolution Density evolution 5-1 Probabilistic analysis of message passing algorithms variable nodes factor nodes x1 a x i x2 a(x i ; x j ; x k ) x3 b x4 consider factor graph model G = (V ;

More information

Lecture 19 : Reed-Muller, Concatenation Codes & Decoding problem

Lecture 19 : Reed-Muller, Concatenation Codes & Decoding problem IITM-CS6845: Theory Toolkit February 08, 2012 Lecture 19 : Reed-Muller, Concatenation Codes & Decoding problem Lecturer: Jayalal Sarma Scribe: Dinesh K Theme: Error correcting codes In the previous lecture,

More information

RUUD PELLIKAAN, HENNING STICHTENOTH, AND FERNANDO TORRES

RUUD PELLIKAAN, HENNING STICHTENOTH, AND FERNANDO TORRES Appeared in: Finite Fields and their Applications, vol. 4, pp. 38-392, 998. WEIERSTRASS SEMIGROUPS IN AN ASYMPTOTICALLY GOOD TOWER OF FUNCTION FIELDS RUUD PELLIKAAN, HENNING STICHTENOTH, AND FERNANDO TORRES

More information

Lecture 10: Broadcast Channel and Superposition Coding

Lecture 10: Broadcast Channel and Superposition Coding Lecture 10: Broadcast Channel and Superposition Coding Scribed by: Zhe Yao 1 Broadcast channel M 0M 1M P{y 1 y x} M M 01 1 M M 0 The capacity of the broadcast channel depends only on the marginal conditional

More information

Frans M.J. Willems. Authentication Based on Secret-Key Generation. Frans M.J. Willems. (joint work w. Tanya Ignatenko)

Frans M.J. Willems. Authentication Based on Secret-Key Generation. Frans M.J. Willems. (joint work w. Tanya Ignatenko) Eindhoven University of Technology IEEE EURASIP Spain Seminar on Signal Processing, Communication and Information Theory, Universidad Carlos III de Madrid, December 11, 2014 : Secret-Based Authentication

More information

An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1

An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1 Kraft s inequality An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if N 2 l i 1 Proof: Suppose that we have a tree code. Let l max = max{l 1,...,

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Chain Independence and Common Information

Chain Independence and Common Information 1 Chain Independence and Common Information Konstantin Makarychev and Yury Makarychev Abstract We present a new proof of a celebrated result of Gács and Körner that the common information is far less than

More information

ECEN 655: Advanced Channel Coding

ECEN 655: Advanced Channel Coding ECEN 655: Advanced Channel Coding Course Introduction Henry D. Pfister Department of Electrical and Computer Engineering Texas A&M University ECEN 655: Advanced Channel Coding 1 / 19 Outline 1 History

More information

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback Vincent Y. F. Tan (NUS) Joint work with Silas L. Fong (Toronto) 2017 Information Theory Workshop, Kaohsiung,

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Information Dimension

Information Dimension Information Dimension Mina Karzand Massachusetts Institute of Technology November 16, 2011 1 / 26 2 / 26 Let X would be a real-valued random variable. For m N, the m point uniform quantized version of

More information

Improving the Alphabet Size in Expander Based Code Constructions

Improving the Alphabet Size in Expander Based Code Constructions Tel Aviv University Raymond and Beverly Sackler Faculty of Exact Sciences School of Computer Sciences Improving the Alphabet Size in Expander Based Code Constructions Submitted as a partial fulfillment

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

An Introduction to (Network) Coding Theory

An Introduction to (Network) Coding Theory An to (Network) Anna-Lena Horlemann-Trautmann University of St. Gallen, Switzerland April 24th, 2018 Outline 1 Reed-Solomon Codes 2 Network Gabidulin Codes 3 Summary and Outlook A little bit of history

More information

Chaos, Complexity, and Inference (36-462)

Chaos, Complexity, and Inference (36-462) Chaos, Complexity, and Inference (36-462) Lecture 7: Information Theory Cosma Shalizi 3 February 2009 Entropy and Information Measuring randomness and dependence in bits The connection to statistics Long-run

More information

Generalized hashing and applications to digital fingerprinting

Generalized hashing and applications to digital fingerprinting Generalized hashing and applications to digital fingerprinting Noga Alon, Gérard Cohen, Michael Krivelevich and Simon Litsyn Abstract Let C be a code of length n over an alphabet of q letters. An n-word

More information

1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H.

1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H. Problem sheet Ex. Verify that the function H(p,..., p n ) = k p k log p k satisfies all 8 axioms on H. Ex. (Not to be handed in). looking at the notes). List as many of the 8 axioms as you can, (without

More information

Small subgraphs of random regular graphs

Small subgraphs of random regular graphs Discrete Mathematics 307 (2007 1961 1967 Note Small subgraphs of random regular graphs Jeong Han Kim a,b, Benny Sudakov c,1,vanvu d,2 a Theory Group, Microsoft Research, Redmond, WA 98052, USA b Department

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information

FUNDAMENTALS OF ERROR-CORRECTING CODES - NOTES. Presenting: Wednesday, June 8. Section 1.6 Problem Set: 35, 40, 41, 43

FUNDAMENTALS OF ERROR-CORRECTING CODES - NOTES. Presenting: Wednesday, June 8. Section 1.6 Problem Set: 35, 40, 41, 43 FUNDAMENTALS OF ERROR-CORRECTING CODES - NOTES BRIAN BOCKELMAN Monday, June 6 2005 Presenter: J. Walker. Material: 1.1-1.4 For Lecture: 1,4,5,8 Problem Set: 6, 10, 14, 17, 19. 1. Basic concepts of linear

More information

CH 61 USING THE GCF IN EQUATIONS AND FORMULAS

CH 61 USING THE GCF IN EQUATIONS AND FORMULAS CH 61 USING THE GCF IN EQUATIONS AND FORMULAS Introduction A while back we studied the Quadratic Formula and used it to solve quadratic equations such as x 5x + 6 = 0; we were also able to solve rectangle

More information

Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families

Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families entropy Article Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families Matilde Marcolli Department of Mathematics, California Institute of Technology, Pasadena,

More information

The expansion of random regular graphs

The expansion of random regular graphs The expansion of random regular graphs David Ellis Introduction Our aim is now to show that for any d 3, almost all d-regular graphs on {1, 2,..., n} have edge-expansion ratio at least c d d (if nd is

More information

Error Correcting Codes Questions Pool

Error Correcting Codes Questions Pool Error Correcting Codes Questions Pool Amnon Ta-Shma and Dean Doron January 3, 018 General guidelines The questions fall into several categories: (Know). (Mandatory). (Bonus). Make sure you know how to

More information

On the Optimum Asymptotic Multiuser Efficiency of Randomly Spread CDMA

On the Optimum Asymptotic Multiuser Efficiency of Randomly Spread CDMA On the Optimum Asymptotic Multiuser Efficiency of Randomly Spread CDMA Ralf R. Müller Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) Lehrstuhl für Digitale Übertragung 13 December 2014 1. Introduction

More information

P. Fenwick [5] has recently implemented a compression algorithm for English text of this type, with good results. EXERCISE. For those of you that know

P. Fenwick [5] has recently implemented a compression algorithm for English text of this type, with good results. EXERCISE. For those of you that know Six Lectures on Information Theory John Kieer 2 Prediction & Information Theory Prediction is important in communications, control, forecasting, investment, and other areas. When the data model is known,

More information

Independence and chromatic number (and random k-sat): Sparse Case. Dimitris Achlioptas Microsoft

Independence and chromatic number (and random k-sat): Sparse Case. Dimitris Achlioptas Microsoft Independence and chromatic number (and random k-sat): Sparse Case Dimitris Achlioptas Microsoft Random graphs W.h.p.: with probability that tends to 1 as n. Hamiltonian cycle Let τ 2 be the moment all

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

An Introduction to Low Density Parity Check (LDPC) Codes

An Introduction to Low Density Parity Check (LDPC) Codes An Introduction to Low Density Parity Check (LDPC) Codes Jian Sun jian@csee.wvu.edu Wireless Communication Research Laboratory Lane Dept. of Comp. Sci. and Elec. Engr. West Virginia University June 3,

More information

An Upper Bound On the Size of Locally. Recoverable Codes

An Upper Bound On the Size of Locally. Recoverable Codes An Upper Bound On the Size of Locally 1 Recoverable Codes Viveck Cadambe Member, IEEE and Arya Mazumdar Member, IEEE arxiv:1308.3200v2 [cs.it] 26 Mar 2015 Abstract In a locally recoverable or repairable

More information

An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets

An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets Jing Guo University of Cambridge jg582@cam.ac.uk Jossy Sayir University of Cambridge j.sayir@ieee.org Minghai Qin

More information

Some Aspects of Finite State Channel related to Hidden Markov Process

Some Aspects of Finite State Channel related to Hidden Markov Process Some Aspects of Finite State Channel related to Hidden Markov Process Kingo Kobayashi National Institute of Information and Communications Tecnology(NICT), 4-2-1, Nukui-Kitamachi, Koganei, Tokyo, 184-8795,

More information

Variable Length Codes for Degraded Broadcast Channels

Variable Length Codes for Degraded Broadcast Channels Variable Length Codes for Degraded Broadcast Channels Stéphane Musy School of Computer and Communication Sciences, EPFL CH-1015 Lausanne, Switzerland Email: stephane.musy@ep.ch Abstract This paper investigates

More information

Lecture 03: Polynomial Based Codes

Lecture 03: Polynomial Based Codes Lecture 03: Polynomial Based Codes Error-Correcting Codes (Spring 016) Rutgers University Swastik Kopparty Scribes: Ross Berkowitz & Amey Bhangale 1 Reed-Solomon Codes Reed Solomon codes are large alphabet

More information

Section 3 Error Correcting Codes (ECC): Fundamentals

Section 3 Error Correcting Codes (ECC): Fundamentals Section 3 Error Correcting Codes (ECC): Fundamentals Communication systems and channel models Definition and examples of ECCs Distance For the contents relevant to distance, Lin & Xing s book, Chapter

More information

FAMILIES OF LOCALLY SEPARATED HAMILTON PATHS

FAMILIES OF LOCALLY SEPARATED HAMILTON PATHS arxiv:1411.3902v1 [math.co] 14 Nov 2014 FAMILIES OF LOCALLY SEPARATED HAMILTON PATHS János Körner e mail: korner@di.uniroma1.it Angelo Monti e mail: Sapienza University of Rome ITALY

More information

Binary Puzzles as an Erasure Decoding Problem

Binary Puzzles as an Erasure Decoding Problem Binary Puzzles as an Erasure Decoding Problem Putranto Hadi Utomo Ruud Pellikaan Eindhoven University of Technology Dept. of Math. and Computer Science PO Box 513. 5600 MB Eindhoven p.h.utomo@tue.nl g.r.pellikaan@tue.nl

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

Strong converse theorems using Rényi entropies

Strong converse theorems using Rényi entropies Strong converse theorems using Rényi entropies Felix Leditzky joint work with Mark M. Wilde and Nilanjana Datta arxiv:1506.02635 5 January 2016 Table of Contents 1 Weak vs. strong converse 2 Rényi entropies

More information