Layered Error-Correction Codes for Real-Time Streaming over Erasure Channels


Ashish Khisti (University of Toronto). Joint work with Ahmed Badr (Toronto), Wai-Tian Tan (Cisco), and John Apostolopoulos (Cisco). Sequential and Adaptive Information Theory Workshop, McGill University, Nov 7, 2013.

Real-Time Communication System (introductory figure)

Motivating Example. Source symbols s_0, s_1, ..., s_7 (each of k symbols) are mapped to channel packets x_i of length n by appending parities p_0, ..., p_7, where
$$p_i = s_i H_0 + s_{i-1} H_1 + \cdots + s_{i-M} H_M, \qquad H_j \in \mathbb{F}_q^{k \times (n-k)}.$$
Erasure codes of this form include Random Linear Codes and Strongly-MDS codes (Gabidulin '88, Gluesing-Luerssen '06). After a burst erasing s_0, s_1, s_2, s_3, the decoder recovers them from the subsequent parities (the contributions of the received s_4, ..., s_7 are first subtracted):
$$\begin{pmatrix} p_4 \\ p_5 \\ p_6 \\ p_7 \end{pmatrix} = \underbrace{\begin{pmatrix} H_4 & H_3 & H_2 & H_1 \\ H_5 & H_4 & H_3 & H_2 \\ 0 & H_5 & H_4 & H_3 \\ 0 & 0 & H_5 & H_4 \end{pmatrix}}_{\text{full rank}} \begin{pmatrix} s_0 \\ s_1 \\ s_2 \\ s_3 \end{pmatrix}.$$
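As a concrete illustration (a minimal sketch with made-up parameters, not the talk's implementation), the parity rule above can be written out over a prime field as follows; the random coefficient matrices correspond to the Random Linear Code option.

```python
import numpy as np

# Minimal sketch of the parity rule p_i = s_i H_0 + ... + s_{i-M} H_M over F_q.
# Field size q, symbol sizes k and n, and memory M are illustrative choices.
q, k, n, M = 257, 4, 8, 5
rng = np.random.default_rng(0)

# Random coefficient matrices H_0, ..., H_M in F_q^{k x (n-k)} (random linear code).
H = [rng.integers(0, q, size=(k, n - k)) for _ in range(M + 1)]

def parity(sources, i):
    """Parity block p_i from the current and past M source packets (row vectors in F_q^k)."""
    p = np.zeros(n - k, dtype=np.int64)
    for j in range(M + 1):
        if i - j >= 0:
            p = (p + sources[i - j] @ H[j]) % q
    return p

# Channel packet x_i = (s_i, p_i): a systematic streaming code of rate k/n.
sources = [rng.integers(0, q, size=k) for _ in range(8)]
packets = [np.concatenate([sources[i], parity(sources, i)]) for i in range(8)]
print(packets[4])
```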

Motivating Example II (B = 4, T = 8). Two rate-1/2 baselines: an erasure code applied to the stream v_0, v_1, ... with parities p_0, p_1, ... recovers a burst of four erasures with delay 7, while a repetition code that resends u_i with a shift of 8 recovers the erased u_0, ..., u_3 with delay 8. Sending the two streams side by side, x_i = (u_i, v_i, p_i, u_{i-8}), gives rate (u+v)/(3u+v) = 1/2.

Motivating Example II (B = 4, T = 8), combined code of rate R = (u+v)/(2u+v) = 2/3. Encoding:
1 Split each source symbol into two groups, s_i = (u_i, v_i).
2 Apply an erasure code to the v_i stream, generating parities p_i.
3 Repeat the u_i symbols with a shift of T.
4 Combine the repeated u_i's with the p_i's, i.e., transmit x_i = (u_i, v_i, p_i + u_{i-T}).
A burst erasing x_0, ..., x_3 is corrected within delay T = 8: v_0, ..., v_3 are recovered from the parities and u_0, ..., u_3 from the shifted repetitions. These are the Maximally Short codes (Martinian and Sundberg '04, Martinian and Trott '07); a minimal encoder sketch follows.
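A minimal sketch of the four steps, assuming for simplicity that the parities p_i are produced by a random linear code on the v-stream (a stand-in for the strongly-MDS code used in the talk); the parameter names and field size are illustrative.

```python
import numpy as np

q, T, B = 257, 8, 4           # field size, delay, burst length (example values)
u_len, v_len = B, T - B       # split sizes used later in the talk: u_i in F_q^B, v_i in F_q^{T-B}
rng = np.random.default_rng(1)

# Illustrative parity rule on the v-stream only (stand-in for a strongly-MDS code).
Hv = [rng.integers(0, q, size=(v_len, u_len)) for _ in range(T)]

def encode(u_hist, v_hist, i):
    """Channel packet x_i = (u_i, v_i, p_i + u_{i-T}) of size 2B + (T-B)."""
    p = np.zeros(u_len, dtype=np.int64)
    for j in range(min(i, T - 1) + 1):              # p_i from v_i, ..., v_{i-T+1}
        p = (p + v_hist[i - j] @ Hv[j]) % q
    u_shift = u_hist[i - T] if i >= T else np.zeros(u_len, dtype=np.int64)
    return np.concatenate([u_hist[i], v_hist[i], (p + u_shift) % q])

u_hist = [rng.integers(0, q, size=u_len) for _ in range(20)]
v_hist = [rng.integers(0, q, size=v_len) for _ in range(20)]
x = [encode(u_hist, v_hist, i) for i in range(20)]
print((u_len + v_len) / x[0].size)   # rate (u+v)/(2u+v) = 8/12 = 2/3
```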

Key Questions. Can we construct streaming codes for realistic channels, such as the Gilbert-Elliott model that alternates between good and bad states? How much performance gain can we obtain? What are the fundamental metrics for low-delay error-correction codes?

Problem Setup / System Model.
Source model: i.i.d. sequence s[t] with p_s(·) = Unif{(F_q)^k}.
Streaming encoder: x[t] = f_t(s[1], ..., s[t]), with x[t] ∈ (F_q)^n.
Erasure channel C(N, B, W): in any sliding window of length W, the channel introduces either a single erasure burst of length at most B, or at most N erasures in arbitrary positions. Example: (N, B, W) = (2, 3, 6).
Delay-constrained decoder: ŝ[t] = g_t(y[1], ..., y[t + T]).
Rate R = k/n; capacity C(N, B, W, T).
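As a concrete illustration of the sliding-window channel, here is a small sketch (my own helper, not code from the talk) that tests whether a loss pattern is admissible for C(N, B, W): every length-W window must contain either a single burst of at most B erasures or at most N erasures.

```python
def admissible(erased, N, B, W):
    """erased: list of 0/1 flags (1 = packet erased). Returns True if every
    length-W window has either one erasure burst of length <= B or <= N erasures."""
    for start in range(len(erased) - W + 1):
        win = erased[start:start + W]
        count = sum(win)
        ones = [i for i, e in enumerate(win) if e]
        single_burst = ones and ones[-1] - ones[0] + 1 == count and count <= B
        if not (count <= N or single_burst):
            return False
    return True

# (N, B, W) = (2, 3, 6): a burst of 3 is fine, but a burst of 2 plus a separate
# erasure inside one window of length 6 is not.
print(admissible([1, 1, 1, 0, 0, 0, 0, 0], N=2, B=3, W=6))   # True
print(admissible([1, 1, 0, 0, 1, 0, 0, 0], N=2, B=3, W=6))   # False
```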

Streaming Codes Capacity, Badr-Khisti-Tan-Apostolopoulos (2013). Theorem. Consider the C(N, B, W) channel with W ≥ B + 1, and let the delay be T.
Upper bound: for any rate-R code,
$$\left(\frac{R}{1-R}\right) B + N \le \min(W, T+1).$$
Lower bound: there exists a rate-R code that satisfies
$$\left(\frac{R}{1-R}\right) B + N \ge \min(W, T+1) - 1.$$
The gap between the upper and lower bounds is one unit of delay.
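The two inequalities can be rearranged to bound the achievable rate directly; the following sketch is my rearrangement (not a formula from the slide) and simply evaluates both sides.

```python
def rate_bounds(N, B, W, T):
    """Returns (guaranteed achievable rate, capacity upper bound) obtained by
    solving R/(1-R) <= (min(W, T+1) - N)/B for R, and the same with
    min(W, T+1) replaced by min(W, T+1) - 1 for the achievability side."""
    m = min(W, T + 1)
    upper = (m - N) / (m - N + B)
    lower = (m - 1 - N) / (m - 1 - N + B)
    return lower, upper

# Example matching the first simulation: T = 12, large W, (N, B) = (2, 9).
print(rate_bounds(N=2, B=9, W=100, T=12))   # ~ (0.526, 0.55); the MiDAS point there uses R = 12/23 ~ 0.522
```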

Streaming Codes for Burst-Erasure Channels: C(N = 1, B, W ≥ T + 1) = T/(T + B).
Assume s_i ∈ F_q^T, split as v_i ∈ F_q^{T−B} and u_i ∈ F_q^B, with parities p_i ∈ F_q^B; the channel packet is x_i = (u_i, v_i, p_i + u_{i−T}).
Recovery after a burst of B erasures starting at time 0:
Recovery of {v_i} by time T − 1: number of unknowns, B symbols over F_q^{T−B}; number of equations, T − B symbols over F_q^B.
Recovery of u_i at time i + T from the shifted repetition.
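For the running example T = 8, B = 4, the two counts indeed match when measured in scalar symbols over F_q:
$$\underbrace{B\,(T-B)}_{\text{unknown } v\text{-symbols}} \;=\; 4 \cdot 4 \;=\; 16 \;=\; 4 \cdot 4 \;=\; \underbrace{(T-B)\,B}_{\text{parity equations available by time } T-1}.$$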

Streaming Codes with Isolated Erasures: C(N ≥ 2, B, W), T = 8. With erasures at times t = 0 and t = 8, u_0 cannot be recovered by the repetition code, because its repeated copy embedded in x_8 is also erased.

Proposed Approach: Layering, for C(N ≥ 2, B, W).
Layered code design:
Burst-erasure streaming code C_1: x_i = (u_i, v_i, p_i + u_{i−T}).
Erasure code on the u-stream: $q_i = \sum_{t=1}^{M} u_{i-t} H^u_t$, with $q_i \in \mathbb{F}_q^{k}$.
Combined code C_2: x_i = (u_i, v_i, p_i + u_{i−T}, q_i).
Rate: $R = \frac{T}{T + B + k}$, with $k = \frac{N}{T - N + 1}\,B$.
Under isolated erasures, the added q-parities recover the erased u-symbols (e.g., v_0, ..., v_2 and u_0, ..., u_2 in the figure), while the first layer continues to handle erasure bursts.
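A quick sketch of the resulting rate (my own arithmetic; rounding k up to an integer is an assumption of this sketch, whereas the talk scales the symbol sizes instead). It reproduces the rates quoted in the simulation section.

```python
from math import ceil

def midas_rate(N, B, T):
    """Rate of the layered code: R = T / (T + B + k) with k = N*B/(T - N + 1),
    rounded up here so the extra parity size is an integer (sketch assumption)."""
    k = ceil(N * B / (T - N + 1))
    return T, T + B + k            # rate as the fraction T / (T + B + k)

print(midas_rate(N=2, B=9, T=12))    # (12, 23)  -> R = 12/23, as in the Gilbert-Elliott plot
print(midas_rate(N=8, B=31, T=40))   # (40, 79)  -> R = 40/79, as in the Fritchman plot
```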

Upper Bound (W ≥ T + 1). Consider a periodic erasure channel with period P = T + 1 − N + B: B erasures in each burst, followed by a guard interval of T + 1 − N. Show that any low-delay code recovers every symbol on this erasure channel; counting then gives
$$R \le \frac{T + 1 - N}{T + 1 - N + B}.$$
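For instance, with T = 12 and (N, B) = (2, 9) as in the first simulation, the bound evaluates to
$$R \le \frac{12 + 1 - 2}{12 + 1 - 2 + 9} = \frac{11}{20} = 0.55,$$
consistent with the rate 12/23 ≈ 0.52 used by the MiDAS code in that experiment.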

Simulation Results. Gilbert-Elliott channel with (α, β) = (5 × 10⁻⁴, 0.5), T = 12 and R = 12/23. In the good state Pr(loss) = ε; in the bad state Pr(loss) = 1.

Loss-probability plot (versus ε, on the order of 10⁻³) for the Gilbert-Elliott channel, (α, β) = (5 × 10⁻⁴, 0.5), T = 12, rate 12/23, comparing Uncoded, the Maximally Short code (N, B) = (1, 11), the Strongly-MDS code (N, B) = (6, 6), and the MiDAS code (N, B) = (2, 9).
Code           N   B
Strongly-MDS   6   6
Burst-Erasure  1  11
MiDAS          2   9
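For readers who want to regenerate such loss traces, here is a minimal Gilbert-Elliott simulator sketch; treating α as the good-to-bad and β as the bad-to-good transition probability is my assumption about the parameterization.

```python
import random

def gilbert_elliott(n, alpha, beta, eps, seed=0):
    """Generate n loss flags from a two-state Markov channel:
    the good state loses a packet with probability eps, the bad state with probability 1."""
    rng = random.Random(seed)
    losses, bad = [], False
    for _ in range(n):
        losses.append(1 if (bad or rng.random() < eps) else 0)
        # state transition: good -> bad w.p. alpha, bad -> good w.p. beta
        bad = (rng.random() < alpha) if not bad else (rng.random() >= beta)
    return losses

trace = gilbert_elliott(10**6, alpha=5e-4, beta=0.5, eps=1e-3)
print(sum(trace) / len(trace))   # overall loss rate of the trace
```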

Simulation Results II. Fritchman channel with (α, β) = (10⁻⁵, 0.5), 9 states (a good state plus error states E_1, ..., E_N), T = 40 and R = 40/79. The histogram of burst lengths (analytical vs. actual probability of occurrence) matches well.

Loss-probability plot for the 9-state Fritchman channel, (α, β) = (10⁻⁵, 0.5), T = 40, comparing Uncoded, RLC, SCo, E-RLC (shift = 32) and E-RLC (shift = 36).
Code           N   B
Strongly-MDS
Burst-Erasure  1  39
MiDAS-I        8  31
MiDAS-II       4  35

Simulation Results III. Gilbert-Elliott channel with (α, β) = (5 × 10⁻⁵, 0.2), simulation length 10⁸, T = 50. Loss-probability plot comparing MS (N, B) = (1, 33), S-BEC (N, B) = (20, 20), MiDAS (N, B) = (4, 30) and PRC (N, B) = (4, 25).
Code           N   B
Strongly-MDS
Burst-Erasure  1  33
MiDAS          4  30
PRC            4  25

Streaming Codes for Burst plus Isolated Erasures, Badr-Khisti-Tan-Apostolopoulos (2013). When do the earlier constructions fail? Example: a burst spans t ∈ [0, 2] and an isolated erasure occurs at some i ∈ [7, 10]; the recovery of u_0, ..., u_2 then suffers interference from the v_i's.

Layered Construction for Partial Recovery (Badr-Khisti-Tan-Apostolopoulos 2013). Example with T = 8, effective delay T_eff = 6, Δ = 2. A third parity stream $r_i = \sum_{t=1}^{M} v_{i-t} H^v_t$ is added, and the repetition layer is shifted by T − Δ, i.e., the combined parities become p_i + u_{i−(T−Δ)}. The layer sizes are chosen so that an isolated loss is covered by the r-parities, v ≤ (Δ + 1) r, and a burst loss is covered jointly, v(B + 1) ≤ (T − B)(u + r).

Code parameters: the layer sizes v, u and r are chosen as explicit functions of T, B and Δ so that both recovery conditions above are met, and the resulting rate is R = (u + v)/(2u + v + r).

Layering Principle: start with a code for C(N, B, W), then add additional parity-check symbols to handle more sophisticated loss patterns.

Unequal Source/Channel Rates.
Source model: i.i.d. process with s[i] uniform over F_q^k.
Streaming encoder: x[i, j] = f_{i,j}(s[0], s[1], ..., s[i]) ∈ F_q^n.
Macro-packet: X[i, :] = [x[i, 1] ... x[i, M]].
Rate: R = H(s)/(nM) = k/(nM).
Packet erasure channel: an erasure burst of at most B channel packets.
Delay-constrained decoder: s[i] must be recovered by macro-packet i + T.

Source-splitting based scheme: split s[i] into sub-symbols (s[i, 1], s[i, 2], ..., s[i, M]) and apply a streaming code designed for M = 1 to the sub-symbol stream. The decoding delay becomes T' = MT, and the achieved rate is R = MT/(MT + B).
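For example, with M = 20 and T = 5 as on the next slide, and taking an illustrative burst of B = 30 channel packets, source splitting gives
$$R = \frac{MT}{MT + B} = \frac{20 \cdot 5}{20 \cdot 5 + 30} = \frac{100}{130} \approx 0.77.$$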

Patil-Badr-Khisti-Tan: rate R versus burst length B for M = 20, T = 5, comparing Optimal codes, SCo codes and MDP codes. Baselines: source splitting, R = MT/(MT + B); Strongly-MDS, R = 1 − B/(M(T + 1)).

Patil-Badr-Khisti-Tan (continued). Capacity: write B = bM + B' with B' ∈ {0, ..., M − 1}. Then, for T > b,
$$C = \begin{cases} \dfrac{T}{T+b}, & 0 \le B' \le \dfrac{bM}{T+b}, \\[2mm] \dfrac{M(T+b+1) - B}{M(T+b+1)}, & \dfrac{bM}{T+b} < B' \le M-1. \end{cases}$$
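A small sketch (illustrative; it transcribes the two-case expression above, assumes T > b, and uses exact rational arithmetic) comparing the capacity against the source-splitting and Strongly-MDS baselines for M = 20, T = 5.

```python
from fractions import Fraction

def streaming_capacity(M, T, B):
    """Two main cases of the capacity formula with B = b*M + B' (T > b assumed)."""
    b, Bp = divmod(B, M)
    if T < b:
        return Fraction(0)
    if Bp * (T + b) <= b * M:                  # B' <= bM/(T+b)
        return Fraction(T, T + b)
    return Fraction(M * (T + b + 1) - B, M * (T + b + 1))

def source_splitting(M, T, B):
    return Fraction(M * T, M * T + B)

def strongly_mds(M, T, B):
    return 1 - Fraction(B, M * (T + 1))

M, T = 20, 5
for B in (10, 30, 60):
    print(B, streaming_capacity(M, T, B), source_splitting(M, T, B), strongly_mds(M, T, B))
# e.g. B = 30: capacity 11/14, versus 10/13 (source splitting) and 3/4 (Strongly-MDS)
```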

Optimal Code Construction I (Patil-Badr-Khisti-Tan 2013). Key idea: (1) apply a code designed for M = 1 to the complete source packet to generate the entire macro-packet; (2) map/reshape the generated macro-packet into M individual channel packets.

Optimal Code Construction II (Patil-Badr-Khisti-Tan). Source splitting: split s[i] into two groups,
s[i] = (s_1[i], ..., s_k[i]) = (u_1[i], ..., u_{k_u}[i], v_1[i], ..., v_{k_v}[i]),
denoted u_vec[i] and v_vec[i].

Optimal Code Construction III (Patil-Badr-Khisti-Tan). Parity generation. Layer 1: a (k_v + k_u, k_v, T) Strongly-MDS code applied to v_vec[·], generating p_vec[i]. Layer 2: a repetition code on u_vec with a shift of T.

Optimal Code Construction IV (Patil-Badr-Khisti-Tan 2013). Overall combined parity: q_vec[i] = p_vec[i] + u_vec[i − T].

Optimal Code Construction V (Patil-Badr-Khisti-Tan). Mapping of the macro-packet to individual channel packets: map the generated macro-packet into M individual channel packets. Rate of the code: (k_u + k_v)/(2k_u + k_v).

Optimal Code Construction VI (Patil-Badr-Khisti-Tan 2013). Overall macro-packet structure (figure).

Decoding Analysis (Patil-Badr-Khisti-Tan 2013). Key fact: the worst-case burst pattern starts at the beginning of a macro-packet. Let B = bM + B'. There are two cases, depending on whether B' ≤ (1 − R)M.
Burst only erases symbols from u_vec[i + b]: recovery of the v-symbols requires (k_u + k_v) b = k_u T; the optimal splitting ratio is k_u/k_v = b/(T − b), giving R = T/(T + b).
Burst also erases symbols from v_vec[i + b]: recovery of the v-symbols requires (k_u + k_v) b + (B' n − k_u) = k_u T; the optimal splitting ratio is k_u/k_v = B/(M(T + b + 1) − 2B), giving R = (M(T + b + 1) − B)/(M(T + b + 1)).

Capacity Result. Theorem. For the streaming setup considered, with any M, T and B, the streaming capacity C is given by
$$C = \begin{cases} \dfrac{T}{T+b}, & B' \le \dfrac{bM}{T+b},\; T \ge b, \\[2mm] \dfrac{M(T+b+1)-B}{M(T+b+1)}, & B' > \dfrac{bM}{T+b},\; T > b, \\[2mm] \dfrac{M-B'}{M}, & B' > \dfrac{M}{2},\; T = b, \\[2mm] 0, & T < b, \end{cases}$$
where the constants b and B' are defined via B = bM + B', with B' ∈ {0, 1, ..., M − 1} and b ∈ ℕ₀.

Robust Extension for M = 1: append an additional layer of parity checks containing a Strongly-MDS code on u (the q_i stream). R = (u + v)/(2u + v + k), which is very close to optimal for k = (N/(T − N + 1)) B.

Robust Extension for any M: there are many possibilities for the placement of the Strongly-MDS parities for u. Approach I: replace the repetition code by a Strongly-MDS code on u; the rate of the code is unchanged, and N = r is achievable. Approach II: append an additional layer of Strongly-MDS parities on u; R = (k_u + k_v)/(2k_u + k_v + k_s), with k_s chosen according to the given N.

Related Works.
Burst erasure channel: Maximally Short codes (block code + interleaving), Martinian and Sundberg (IT 2004), Martinian and Trott (ISIT 2007).
Other variations of streaming codes: burst and isolated loss patterns (Badr-Khisti-Tan-Apostolopoulos, ISIT 2013); unequal source/channel rates (Patil-Badr-Khisti-Tan, Asilomar 2013); multicast extension (Khisti-Singh 2009, Badr-Lui-Khisti, Allerton 2010); parallel channels (Lui-Badr-Khisti, CWIT 2011); multi-source streaming codes (Lui, Thesis 2011).
Connections between network coding and real-time streaming codes (Tekin-Ho-Yao-Jaggi, ITA 2012; Leong and Ho, ISIT 2013).
Tree codes: Schulman (IT 1996), Sahai (2001), Martinian and Wornell (Allerton 2004), Sukhavasi and Hassibi (2011).

Distance and Span Properties. The layered construction above (burst code plus a Strongly-MDS layer on u) has rate R = (u + v)/(2u + v + k), very close to optimal for k = (N/(T − N + 1)) B. MiDAS: (Near) Maximum Distance And Span tradeoff.

Distance and Span Properties. Consider an (n, k, m) convolutional code: $x_i = \sum_{j=0}^{m} s_{i-j} G_j$ (states, trellis diagram, free distance).

Column distance (over truncated inputs with s_0 ≠ 0):
$$d_T = \min_{[s_0, \ldots, s_T],\, s_0 \neq 0} \operatorname{wt}\!\left( [s_0 \; \cdots \; s_T] \begin{pmatrix} G_0 & G_1 & \cdots & G_T \\ 0 & G_0 & \cdots & G_{T-1} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & G_0 \end{pmatrix} \right)$$
Column span (the length of the shortest window of packet positions containing all nonzero entries of the same truncated codeword):
$$c_T = \min_{[s_0, \ldots, s_T],\, s_0 \neq 0} \operatorname{span}\!\left( [s_0 \; \cdots \; s_T] \begin{pmatrix} G_0 & G_1 & \cdots & G_T \\ 0 & G_0 & \cdots & G_{T-1} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & G_0 \end{pmatrix} \right)$$
A brute-force illustration follows.
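To make the definitions concrete, here is a brute-force sketch for a toy rate-1/2 binary convolutional code; the generator matrices and the packet-level weight convention (natural for packet erasures) are my illustrative assumptions, not values from the talk.

```python
import itertools
import numpy as np

# Toy generator matrices G_0 = [1 1], G_1 = [0 1], G_2 = [1 0] (arbitrary example).
G = [np.array([[1, 1]]), np.array([[0, 1]]), np.array([[1, 0]])]
n, k, m = 2, 1, len(G) - 1

def truncated_codeword(s, T):
    """x_0, ..., x_T for input s_0, ..., s_T (binary, memory m)."""
    x = np.zeros((T + 1, n), dtype=int)
    for i in range(T + 1):
        for j in range(min(i, m) + 1):
            x[i] = (x[i] + s[i - j] @ G[j]) % 2
    return x

def d_and_c(T):
    """Brute-force column distance d_T and column span c_T (s_0 != 0 enforced)."""
    best_d, best_c = None, None
    for bits in itertools.product([0, 1], repeat=k * (T + 1)):
        s = np.array(bits).reshape(T + 1, k)
        if not s[0].any():
            continue
        x = truncated_codeword(s, T)
        nz = [i for i in range(T + 1) if x[i].any()]
        wt = len(nz)                                  # weight counted in nonzero packets
        span = nz[-1] - nz[0] + 1 if nz else 0
        best_d = wt if best_d is None else min(best_d, wt)
        best_c = span if best_c is None else min(best_c, span)
    return best_d, best_c

print(d_and_c(T=3))
```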

Column-Distance and Column-Span Tradeoff, Badr-Khisti-Tan-Apostolopoulos (2013).
Theorem. Consider a C(N, B, W) channel with delay T and W ≥ T + 1. A streaming code is feasible over this channel if and only if it satisfies d_T ≥ N + 1 and c_T ≥ B + 1.
Theorem. For any rate-R convolutional code and any T ≥ 0, the column distance d_T and column span c_T satisfy
$$\left(\frac{R}{1-R}\right) c_T + d_T \le T + 1 + \frac{1}{1-R}$$
(this is the earlier bound with N = d_T − 1 and B = c_T − 1), and there exists a rate-R code (a MiDAS code) over a sufficiently large field whose column distance and column span come within one unit of delay of this bound.

Multicast (Low-Delay) Codes: burst-erasure broadcast channel. A single encoded stream X[i] is received by Decoder 1 (burst length B_1, delay T_1) and Decoder 2 (burst length B_2, delay T_2). Motivation: B_1 < B_2; receiver 1 sees a good channel state, receiver 2 a weaker one; the delay adapts to the channel state.

Channel-Adaptive Delay. Example: B_1 = 2, T_1 = 4, B_2 = 3 and T_2 = 5. A burst of length B_1 results in a delay of T_1; a burst of length B_2 results in a delay of T_2. Stretch factor: s = (T_2 − B_1)/(T_1 − B_1). There is a tradeoff between s and Pr(loss).

Diversity Embedded Streaming Codes (DE-SCo), Badr-Khisti-Martinian 2011: burst-erasure broadcast channel with decoders (B_1, T_1) and (B_2, T_2). Theorem. There exists a {(B_1, T_1), (B_2, T_2)} DE-SCo construction of rate R = T_1/(T_1 + B_1) provided T_2 ≥ (B_2/B_1)(T_1 + B_1). This code has polynomial-time encoding and decoding complexity.

Multicast (Low-Delay) Capacity: burst-erasure broadcast channel with decoders (B_1, T_1) and (B_2, T_2). Capacity function C(T_1, T_2, B_1, B_2).
Single-user upper bound: C ≤ min( T_1/(T_1 + B_1), T_2/(T_2 + B_2) ).
Concatenation lower bound: C ≥ 1/(1 + B_1/T_1 + B_2/T_2).

Multicast (Low-Delay) Capacity, Badr-Khisti-Lui 2011. Assume w.l.o.g. B_2 ≥ B_1. Figure: the delay plane is partitioned by the lines T_2 = T_1 and T_2 = T_1 + B_2 into regions A through E, with SCo and DE-SCo labeled on the boundary; capacity expressions are listed for regions A through D, while region E is marked '??' (open).

Other Extensions.
Multiple erasure bursts (Li-Khisti-Girod 2011): interleaved low-delay codes.
Multiple links (Lui-Badr-Khisti 2011): layered coding for burst-erasure channels.
Multiple source streams with different decoding delays (Lui 2011): embedded codes.

Conclusions. Error-correction codes for real-time streaming; deterministic channel models C(N, B, W); tradeoff between achievable N and B; MiDAS constructions; column-distance and column-span tradeoff; partial-recovery codes for burst plus isolated erasures; unequal source/channel rates.


Robust Network Codes for Unicast Connections: A Case Study Robust Network Codes for Unicast Connections: A Case Study Salim Y. El Rouayheb, Alex Sprintson, and Costas Georghiades Department of Electrical and Computer Engineering Texas A&M University College Station,

More information

Multi-Input Multi-Output Systems (MIMO) Channel Model for MIMO MIMO Decoding MIMO Gains Multi-User MIMO Systems

Multi-Input Multi-Output Systems (MIMO) Channel Model for MIMO MIMO Decoding MIMO Gains Multi-User MIMO Systems Multi-Input Multi-Output Systems (MIMO) Channel Model for MIMO MIMO Decoding MIMO Gains Multi-User MIMO Systems Multi-Input Multi-Output Systems (MIMO) Channel Model for MIMO MIMO Decoding MIMO Gains Multi-User

More information

Sequential Coding of Gauss Markov Sources With Packet Erasures and Feedback

Sequential Coding of Gauss Markov Sources With Packet Erasures and Feedback Sequential Coding of Gauss Markov Sources With Packet Erasures and Feedback Anatoly Khina, Victoria Kostina, Ashish Khisti, and Babak Hassibi Abstract We consider the problem of sequential transmission

More information

Decoding of Convolutional Codes over the Erasure Channel

Decoding of Convolutional Codes over the Erasure Channel 1 Decoding of Convolutional Codes over the Erasure Channel Virtudes Tomás, Joachim Rosenthal, Senior Member, IEEE, and Roxana Smarandache, Member, IEEE arxiv:10063156v3 [csit] 31 Aug 2011 Abstract In this

More information

On the Capacity of Secure Network Coding

On the Capacity of Secure Network Coding On the Capacity of Secure Network Coding Jon Feldman Dept. of IEOR Tal Malkin Dept. of CS Rocco A. Servedio Dept. of CS Columbia University, New York, NY {jonfeld@ieor, tal@cs, rocco@cs, cliff@ieor}.columbia.edu

More information

On the Existence of Optimal Exact-Repair MDS Codes for Distributed Storage

On the Existence of Optimal Exact-Repair MDS Codes for Distributed Storage On the Existence of Optimal Exact-Repair MDS Codes for Distributed Storage Changho Suh Kannan Ramchandran Electrical Engineering and Computer Sciences University of California at Berkeley Technical Report

More information

Sector-Disk Codes and Partial MDS Codes with up to Three Global Parities

Sector-Disk Codes and Partial MDS Codes with up to Three Global Parities Sector-Disk Codes and Partial MDS Codes with up to Three Global Parities Junyu Chen Department of Information Engineering The Chinese University of Hong Kong Email: cj0@alumniiecuhkeduhk Kenneth W Shum

More information

Distributed storage systems from combinatorial designs

Distributed storage systems from combinatorial designs Distributed storage systems from combinatorial designs Aditya Ramamoorthy November 20, 2014 Department of Electrical and Computer Engineering, Iowa State University, Joint work with Oktay Olmez (Ankara

More information

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus Turbo Compression Andrej Rikovsky, Advisor: Pavol Hanus Abstract Turbo codes which performs very close to channel capacity in channel coding can be also used to obtain very efficient source coding schemes.

More information

Correcting Bursty and Localized Deletions Using Guess & Check Codes

Correcting Bursty and Localized Deletions Using Guess & Check Codes Correcting Bursty and Localized Deletions Using Guess & Chec Codes Serge Kas Hanna, Salim El Rouayheb ECE Department, Rutgers University serge..hanna@rutgers.edu, salim.elrouayheb@rutgers.edu Abstract

More information

Turbo Codes for xdsl modems

Turbo Codes for xdsl modems Turbo Codes for xdsl modems Juan Alberto Torres, Ph. D. VOCAL Technologies, Ltd. (http://www.vocal.com) John James Audubon Parkway Buffalo, NY 14228, USA Phone: +1 716 688 4675 Fax: +1 716 639 0713 Email:

More information

Explicit MBR All-Symbol Locality Codes

Explicit MBR All-Symbol Locality Codes Explicit MBR All-Symbol Locality Codes Govinda M. Kamath, Natalia Silberstein, N. Prakash, Ankit S. Rawat, V. Lalitha, O. Ozan Koyluoglu, P. Vijay Kumar, and Sriram Vishwanath 1 Abstract arxiv:1302.0744v2

More information

The E8 Lattice and Error Correction in Multi-Level Flash Memory

The E8 Lattice and Error Correction in Multi-Level Flash Memory The E8 Lattice and Error Correction in Multi-Level Flash Memory Brian M Kurkoski University of Electro-Communications Tokyo, Japan kurkoski@iceuecacjp Abstract A construction using the E8 lattice and Reed-Solomon

More information

Fountain Uncorrectable Sets and Finite-Length Analysis

Fountain Uncorrectable Sets and Finite-Length Analysis Fountain Uncorrectable Sets and Finite-Length Analysis Wen Ji 1, Bo-Wei Chen 2, and Yiqiang Chen 1 1 Beijing Key Laboratory of Mobile Computing and Pervasive Device Institute of Computing Technology, Chinese

More information

Optimum Soft Decision Decoding of Linear Block Codes

Optimum Soft Decision Decoding of Linear Block Codes Optimum Soft Decision Decoding of Linear Block Codes {m i } Channel encoder C=(C n-1,,c 0 ) BPSK S(t) (n,k,d) linear modulator block code Optimal receiver AWGN Assume that [n,k,d] linear block code C is

More information

Optimal Scalar Linear Index Codes for Symmetric and Neighboring Side-information Problems

Optimal Scalar Linear Index Codes for Symmetric and Neighboring Side-information Problems Optimal Scalar Linear Index Codes for Symmetric and Neighboring Side-information Problems Mahesh Babu Vaddi and B Sundar Rajan Department of Electrical Communication Engineering, Indian Institute of Science,

More information

Interference Channels with Source Cooperation

Interference Channels with Source Cooperation Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL

More information

An Introduction to (Network) Coding Theory

An Introduction to (Network) Coding Theory An Introduction to (Network) Coding Theory Anna-Lena Horlemann-Trautmann University of St. Gallen, Switzerland July 12th, 2018 1 Coding Theory Introduction Reed-Solomon codes 2 Introduction Coherent network

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Guess & Check Codes for Deletions, Insertions, and Synchronization

Guess & Check Codes for Deletions, Insertions, and Synchronization Guess & Check Codes for Deletions, Insertions, and Synchronization Serge Kas Hanna, Salim El Rouayheb ECE Department, Rutgers University sergekhanna@rutgersedu, salimelrouayheb@rutgersedu arxiv:759569v3

More information

ELEMENT OF INFORMATION THEORY

ELEMENT OF INFORMATION THEORY History Table of Content ELEMENT OF INFORMATION THEORY O. Le Meur olemeur@irisa.fr Univ. of Rennes 1 http://www.irisa.fr/temics/staff/lemeur/ October 2010 1 History Table of Content VERSION: 2009-2010:

More information

An Introduction to (Network) Coding Theory

An Introduction to (Network) Coding Theory An to (Network) Anna-Lena Horlemann-Trautmann University of St. Gallen, Switzerland April 24th, 2018 Outline 1 Reed-Solomon Codes 2 Network Gabidulin Codes 3 Summary and Outlook A little bit of history

More information

Lecture 8: MIMO Architectures (II) Theoretical Foundations of Wireless Communications 1. Overview. Ragnar Thobaben CommTh/EES/KTH

Lecture 8: MIMO Architectures (II) Theoretical Foundations of Wireless Communications 1. Overview. Ragnar Thobaben CommTh/EES/KTH MIMO : MIMO Theoretical Foundations of Wireless Communications 1 Wednesday, May 25, 2016 09:15-12:00, SIP 1 Textbook: D. Tse and P. Viswanath, Fundamentals of Wireless Communication 1 / 20 Overview MIMO

More information

Dynamic Power Allocation and Routing for Time Varying Wireless Networks

Dynamic Power Allocation and Routing for Time Varying Wireless Networks Dynamic Power Allocation and Routing for Time Varying Wireless Networks X 14 (t) X 12 (t) 1 3 4 k a P ak () t P a tot X 21 (t) 2 N X 2N (t) X N4 (t) µ ab () rate µ ab µ ab (p, S 3 ) µ ab µ ac () µ ab (p,

More information

Regenerating Codes and Locally Recoverable. Codes for Distributed Storage Systems

Regenerating Codes and Locally Recoverable. Codes for Distributed Storage Systems Regenerating Codes and Locally Recoverable 1 Codes for Distributed Storage Systems Yongjune Kim and Yaoqing Yang Abstract We survey the recent results on applying error control coding to distributed storage

More information

An Introduction to Low Density Parity Check (LDPC) Codes

An Introduction to Low Density Parity Check (LDPC) Codes An Introduction to Low Density Parity Check (LDPC) Codes Jian Sun jian@csee.wvu.edu Wireless Communication Research Laboratory Lane Dept. of Comp. Sci. and Elec. Engr. West Virginia University June 3,

More information

SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land

SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land Ingmar Land, SIPCom8-1: Information Theory and Coding (2005 Spring) p.1 Overview Basic Concepts of Channel Coding Block Codes I:

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

Linear Exact Repair Rate Region of (k + 1, k, k) Distributed Storage Systems: A New Approach

Linear Exact Repair Rate Region of (k + 1, k, k) Distributed Storage Systems: A New Approach Linear Exact Repair Rate Region of (k + 1, k, k) Distributed Storage Systems: A New Approach Mehran Elyasi Department of ECE University of Minnesota melyasi@umn.edu Soheil Mohajer Department of ECE University

More information

Progress on High-rate MSR Codes: Enabling Arbitrary Number of Helper Nodes

Progress on High-rate MSR Codes: Enabling Arbitrary Number of Helper Nodes Progress on High-rate MSR Codes: Enabling Arbitrary Number of Helper Nodes Ankit Singh Rawat CS Department Carnegie Mellon University Pittsburgh, PA 523 Email: asrawat@andrewcmuedu O Ozan Koyluoglu Department

More information

Incremental Coding over MIMO Channels

Incremental Coding over MIMO Channels Model Rateless SISO MIMO Applications Summary Incremental Coding over MIMO Channels Anatoly Khina, Tel Aviv University Joint work with: Yuval Kochman, MIT Uri Erez, Tel Aviv University Gregory W. Wornell,

More information

Delay, feedback, and the price of ignorance

Delay, feedback, and the price of ignorance Delay, feedback, and the price of ignorance Anant Sahai based in part on joint work with students: Tunc Simsek Cheng Chang Wireless Foundations Department of Electrical Engineering and Computer Sciences

More information

Noisy Group Testing and Boolean Compressed Sensing

Noisy Group Testing and Boolean Compressed Sensing Noisy Group Testing and Boolean Compressed Sensing Venkatesh Saligrama Boston University Collaborator: George Atia Outline Examples Problem Setup Noiseless Problem Average error, Worst case error Approximate

More information

Roll No. :... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/ CODING & INFORMATION THEORY. Time Allotted : 3 Hours Full Marks : 70

Roll No. :... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/ CODING & INFORMATION THEORY. Time Allotted : 3 Hours Full Marks : 70 Name : Roll No. :.... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/2011-12 2011 CODING & INFORMATION THEORY Time Allotted : 3 Hours Full Marks : 70 The figures in the margin indicate full marks

More information

Chapter 7: Channel coding:convolutional codes

Chapter 7: Channel coding:convolutional codes Chapter 7: : Convolutional codes University of Limoges meghdadi@ensil.unilim.fr Reference : Digital communications by John Proakis; Wireless communication by Andreas Goldsmith Encoder representation Communication

More information

Interactive Decoding of a Broadcast Message

Interactive Decoding of a Broadcast Message In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,

More information

Locally Encodable and Decodable Codes for Distributed Storage Systems

Locally Encodable and Decodable Codes for Distributed Storage Systems Locally Encodable and Decodable Codes for Distributed Storage Systems Son Hoang Dau, Han Mao Kiah, Wentu Song, Chau Yuen Singapore University of Technology and Design, Nanyang Technological University,

More information

Hybrid Noncoherent Network Coding

Hybrid Noncoherent Network Coding Hybrid Noncoherent Network Coding Vitaly Skachek, Olgica Milenkovic, Angelia Nedić University of Illinois, Urbana-Champaign 1308 W. Main Street, Urbana, IL 61801, USA Abstract We describe a novel extension

More information

THE seminal paper of Gallager [1, p. 48] suggested to evaluate

THE seminal paper of Gallager [1, p. 48] suggested to evaluate IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 11, NOVEMBER 2004 2657 Extrinsic Information Transfer Functions: Model and Erasure Channel Properties Alexei Ashikhmin, Member, IEEE, Gerhard Kramer,

More information

Physical Layer and Coding

Physical Layer and Coding Physical Layer and Coding Muriel Médard Professor EECS Overview A variety of physical media: copper, free space, optical fiber Unified way of addressing signals at the input and the output of these media:

More information

Anatoly Khina. Joint work with: Uri Erez, Ayal Hitron, Idan Livni TAU Yuval Kochman HUJI Gregory W. Wornell MIT

Anatoly Khina. Joint work with: Uri Erez, Ayal Hitron, Idan Livni TAU Yuval Kochman HUJI Gregory W. Wornell MIT Network Modulation: Transmission Technique for MIMO Networks Anatoly Khina Joint work with: Uri Erez, Ayal Hitron, Idan Livni TAU Yuval Kochman HUJI Gregory W. Wornell MIT ACC Workshop, Feder Family Award

More information

CS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability

More information

Fault Tolerance Technique in Huffman Coding applies to Baseline JPEG

Fault Tolerance Technique in Huffman Coding applies to Baseline JPEG Fault Tolerance Technique in Huffman Coding applies to Baseline JPEG Cung Nguyen and Robert G. Redinbo Department of Electrical and Computer Engineering University of California, Davis, CA email: cunguyen,

More information

Optimal Locally Repairable and Secure Codes for Distributed Storage Systems

Optimal Locally Repairable and Secure Codes for Distributed Storage Systems Optimal Locally Repairable and Secure Codes for Distributed Storage Systems Ankit Singh Rawat, O. Ozan Koyluoglu, Natalia Silberstein, and Sriram Vishwanath arxiv:0.6954v [cs.it] 6 Aug 03 Abstract This

More information

Compressed Sensing Using Bernoulli Measurement Matrices

Compressed Sensing Using Bernoulli Measurement Matrices ITSchool 11, Austin Compressed Sensing Using Bernoulli Measurement Matrices Yuhan Zhou Advisor: Wei Yu Department of Electrical and Computer Engineering University of Toronto, Canada Motivation Motivation

More information