(Structured) Coding for Real-Time Streaming Communication
Ashish Khisti
Department of Electrical and Computer Engineering, University of Toronto
Joint work with: Ahmed Badr (Toronto), Farrokh Etezadi (Toronto), Wai-Tian Tan (Cisco), John Apostolopoulos (Cisco), Mitchell Trott (Luminate Wireless)
October 9, 2014, University of Michigan, Ann Arbor
Multimedia Streaming

[Figure: streaming server sending packets 1-6 over a network to three receivers, each with its own latency and interruption pattern]

Applications:

Application        | Bit-Rate (Mbps) | MSDU (B) | Delay (ms) | PLR
Video Conferencing | 2               | 1500     | 100        | 10^-4
Interactive Gaming | 1               | 512      | 50         | 10^-4
Video Streaming    | 4               | 1500     | 500        | 10^-6
Multimedia Streaming

[Figure: streaming server sending packets 1-6 to three receivers with different latencies and interruptions]

Packet-Loss Analysis:
- Burst losses cause severe performance degradation
- FEC vs. retransmission
Outline

Real-Time Streaming Communication
- Streaming Error Correction (with Ahmed Badr, Toronto; Wai-Tian Tan and John Apostolopoulos, Cisco)
- Streaming Compression (with Farrokh Etezadi, Toronto; Mitchell Trott)
- Physical Layer Security (time permitting)
Real-Time Communication System
Proposed Channel Model

[Figure: Gilbert-Elliott two-state Markov model, with transitions between a Good state and a Bad state]

Gilbert-Elliott model: structural properties, performance gains.

Sliding Window Erasure Channel: C(N, B, W)
In any sliding window of length W, the channel can introduce only one of the following:
- an erasure burst of maximum length B, or
- up to N erasures in arbitrary positions.
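The window constraint above can be stated operationally. A minimal sketch (helper name is hypothetical, not from the talk) that tests whether a given erasure pattern is admissible for C(N, B, W):

```python
def admissible(erasures, N, B, W):
    """Check whether a binary erasure pattern (1 = erased) is allowed by
    the sliding-window channel C(N, B, W): every length-W window must
    contain either at most N (possibly scattered) erasures, or a single
    contiguous burst of length at most B."""
    for start in range(len(erasures) - W + 1):
        win = erasures[start:start + W]
        count = sum(win)
        if count <= N:
            continue  # few enough arbitrary erasures in this window
        # otherwise the erasures must form one contiguous burst of length <= B
        idx = [i for i, e in enumerate(win) if e]
        single_burst = (idx[-1] - idx[0] + 1 == count)
        if not (single_burst and count <= B):
            return False
    return True

# The (N, B, W) = (2, 3, 6) example from the slide: two length-3 bursts,
# far enough apart that no window of length 6 sees both.
print(admissible([1,1,1,0,0,0,0,0,0,1,1,1,0,0,0,0], 2, 3, 6))  # True
```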
Sliding Window Erasure Channel: Remarks

Example: (N, B, W) = (2, 3, 6)
[Figure: erasure patterns on time slots 0-15; each length-W = 6 window contains either up to N = 2 arbitrary erasures or one burst of length up to B = 3]

C(N = 1, B, W): reduces to a burst-erasure channel (bursts of length B separated by guard intervals of at least W - 1 slots).

C(N, B, W), worst case: a periodic pattern of erasure bursts of length B separated by guard intervals of length W - N.
Problem Setup - Sliding Window Erasure Channel Model

Source Model: i.i.d. sequence s[t] ~ Unif{(F_q)^k}
Streaming Encoder: x[t] = f_t(s[0], ..., s[t]), x[t] ∈ (F_q)^n
Erasure Channel: sliding window model
Delay-Constrained Decoder: ŝ[t] = g_t(y[0], ..., y[t + T])
Rate: R = k/n

Streaming Capacity
A rate R is achievable over the C(N, B, W) channel if there is a sequence of encoding and decoding functions, f_t(·) and g_t(·) respectively, over a sufficiently large field F_q, with delay T and rate R = k/n. The supremum of achievable rates is the streaming capacity.

Remarks: worst-case definition; arbitrarily large field size.
Main Result
Badr-Patil-Khisti-Tan-Apostolopoulos, IEEE Trans. on Inform. Theory (under review)

Theorem. Consider the C(N, B, W) channel with W ≥ B + 1, and let the delay be T.

Upper bound: for any rate R code,
  (R/(1-R)) B + N ≤ min(W, T + 1)

Lower bound: there exists a rate R code that satisfies
  (R/(1-R)) B + N ≥ min(W, T + 1) - 1

The gap between the upper and lower bounds is one unit of delay.
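Solving the upper bound for R gives a closed form. A small sanity-check sketch (hypothetical helper, assuming the bound as stated in the theorem):

```python
def rate_upper_bound(N, B, W, T):
    """Largest R satisfying (R/(1-R))*B + N <= min(W, T+1), i.e.
    R <= (M - N) / (M - N + B) with M = min(W, T+1)."""
    M = min(W, T + 1)
    return (M - N) / (M - N + B)

# With N = 1 and W large, M - N = T, recovering the burst-only
# capacity T / (T + B), e.g. 2/3 for the B = 4, T = 8 example later.
print(rate_upper_bound(1, 4, 100, 8))
```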
Baseline Codes - Full Rank Condition

[Figure: source packets s_0, ..., s_7 of k symbols each, with parity packets p_0, ..., p_7 of n - k symbols; channel packet x_i = (s_i, p_i)]

p_i = s_i H_0 + s_{i-1} H_1 + ... + s_{i-M} H_M,   H_i ∈ F_q^{k x (n-k)}

Erasure Codes:
- Random linear codes
- Strongly-MDS codes (Gabidulin '88, Gluesing-Luerssen et al. '06)

Example: if s_0, ..., s_3 are erased, they are recovered from p_4, ..., p_7 whenever the block matrix below is full rank:

[p_4]   [H_4  H_3  H_2  H_1] [s_0]
[p_5] = [H_5  H_4  H_3  H_2] [s_1]
[p_6]   [0    H_5  H_4  H_3] [s_2]
[p_7]   [0    0    H_5  H_4] [s_3]
Streaming Code - Example: B = 4, T = 8, R = T/(T+B) = 2/3

Rate-1/2 baseline erasure code, T = 7:
[Figure: symbols v_0, ..., v_11 with parities p_0, ..., p_11; after a burst erasing the first four packets, v_0, ..., v_3 are recovered from later parities]

Rate-1/2 repetition code, T = 8:
[Figure: symbols u_0, ..., u_11 repeated with a shift of T; after the same burst, u_0, ..., u_3 are recovered from their repeated copies at times 8, ..., 11]

Layered code, R = 2/3:
[Figure: channel packet x_i = (u_i, v_i, p_i + u_{i-T}); after a burst erasing packets 0-3, the v-symbols v_0, ..., v_3 are recovered from the parities, and the u-symbols u_0, ..., u_3 from the repeated copies within delay T = 8]

Encoding:
1. Source splitting: s_i = (u_i, v_i), u_i ∈ F_q^B, v_i ∈ F_q^{T-B}
2. Erasure code on v_i: generate v_i → (v_i, p_i), where p_i ∈ F_q^B is obtained from a Strongly-MDS code
3. Repetition code on u_i: repeat the u_i symbols with a shift of T
4. Merging: combine the repeated u_i's with the p_i's
5. Rate: R = T/(T+B)
Streaming Code - Burst Erasure Channel
1. Source splitting: s_i = (u_i, v_i), u_i ∈ F_q^B, v_i ∈ F_q^{T-B}
2. Erasure code on v_i: generate v_i → (v_i, p_i), where p_i ∈ F_q^B is obtained from a Strongly-MDS code
3. Repetition code on u_i: repeat the u_i symbols with a shift of T
4. Merging: combine the repeated u_i's with the p_i's
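The four steps above can be sketched as a toy encoder. This is a hypothetical simplification, not the construction from the talk: a random linear parity over the integers stands in for the Strongly-MDS code, and field arithmetic is omitted; only the packet layout (u_i, v_i, p_i + u_{i-T}) and the rate T/(T+B) are faithful.

```python
import random

def layered_stream_encoder(source, B, T, seed=0):
    """Toy layered burst-erasure streaming encoder: each source packet
    s_i (a list of T integers) is split into u_i = s_i[:B] and
    v_i = s_i[B:]; the channel packet is u_i, v_i, p_i + u_{i-T}, where
    p_i is a (stand-in) linear parity of the past T v-packets."""
    rng = random.Random(seed)
    # hypothetical parity coefficients: H[t][b][j] combines v_{i-t}[j] into p_i[b]
    H = [[[rng.randrange(1, 7) for _ in range(T - B)] for _ in range(B)]
         for _ in range(T)]
    us, vs, packets = [], [], []
    for i, s in enumerate(source):
        u, v = s[:B], s[B:]
        us.append(u)
        vs.append(v)
        p = [0] * B
        for t in range(1, T + 1):  # parity memory = T (illustrative choice)
            if i - t >= 0:
                for b in range(B):
                    p[b] += sum(h * x for h, x in zip(H[t - 1][b], vs[i - t]))
        u_shift = us[i - T] if i >= T else [0] * B
        # channel packet has n = T + B symbols, so the rate is R = T / (T + B)
        packets.append(u + v + [pb + ub for pb, ub in zip(p, u_shift)])
    return packets
```

For the slide's parameters (B = 4, T = 8), each length-8 source packet maps to a length-12 channel packet, i.e. rate 2/3.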
Robust Extension: C(N, B, W) Channel - Layered Code Design

[Figure: layered code with symbols u_i, v_i, combined parities p_i + u_{i-T}, and an additional parity layer q_i of k symbols]

- Burst-erasure streaming code: (u_i, v_i, p_i + u_{i-T})
- Additional erasure code: q_i = Σ_{t=1}^{M} u_{i-t} H^u_t, q_i ∈ F_q^k
- Concatenation: x_i = (u_i, v_i, p_i + u_{i-T}, q_i)
- Attains the lower bound with rate R = T/(T + B + k)
Distance and Span Properties

Consider an (n, k, m) convolutional code: x_i = Σ_{j=0}^{m} s_{i-j} G_j

[Figure: trellis diagram over states, illustrating the free distance and the column span over the window [0, 3]]

Column distance:
  d_T = min over [s_0, ..., s_T], s_0 ≠ 0, of wt( [s_0 ... s_T] G^c_T )
where G^c_T is the truncated generator matrix
  [G_0  G_1  ...  G_T    ]
  [0    G_0  ...  G_{T-1}]
  [          ...         ]
  [0    ...  0    G_0    ]

Column span:
  c_T = min over [s_0, ..., s_T], s_0 ≠ 0, of span( [s_0 ... s_T] G^c_T )
(the span of a codeword segment is the number of positions from its first to its last nonzero symbol, inclusive).
Column-Distance and Column-Span Tradeoff
Badr-Patil-Khisti-Tan-Apostolopoulos 2014

Theorem. Consider a C(N, B, W) channel with delay T and W ≥ T + 1. A streaming code is feasible over this channel if and only if it satisfies
  d_T ≥ N + 1  and  c_T ≥ B + 1.

Theorem. For any rate R convolutional code and any T ≥ 0, the column distance d_T and column span c_T satisfy
  (R/(1-R)) c_T + d_T ≤ T + 1 + 1/(1-R).
There exists a rate R code (MiDAS code) over a sufficiently large field that satisfies
  (R/(1-R)) c_T + d_T ≥ T + 1/(1-R).
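To make the two metrics concrete, here is a brute-force computation for tiny binary codes (a hypothetical helper, not from the talk). Weight and span are counted in channel packets, matching the packet-erasure channel: d_T is the minimum number of nonzero packets, c_T the minimum span from first to last nonzero packet, both over inputs with s_0 ≠ 0 (assuming G_0 ≠ 0, so the codeword is never all-zero).

```python
from itertools import product

def column_distance_span(G, T):
    """Brute-force d_T and c_T for a small binary (k = 1) convolutional
    code x_i = sum_j s_{i-j} G_j over GF(2).  G is a list of generator
    blocks G_0..G_m, each a tuple of n bits."""
    m = len(G) - 1
    n = len(G[0])
    d = c = None
    for s in product([0, 1], repeat=T + 1):
        if s[0] != 1:
            continue
        packets = []
        for i in range(T + 1):
            blk = [0] * n
            for j in range(min(i, m) + 1):
                if s[i - j]:
                    blk = [a ^ b for a, b in zip(blk, G[j])]
            packets.append(blk)
        nz = [i for i, blk in enumerate(packets) if any(blk)]
        wt, span = len(nz), nz[-1] - nz[0] + 1
        d = wt if d is None else min(d, wt)
        c = span if c is None else min(c, span)
    return d, c

# systematic delay-repetition code x_i = (s_i, s_{i-1}):
print(column_distance_span([(1, 0), (0, 1)], 3))
```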
Simulation Setup
Gilbert-Elliott channel: (α, β) = (5x10^-4, 0.5), T = 12, R ≈ 0.5
Good state: Pr(loss) = ε; Bad state: Pr(loss) = 1
Simulation length = 10^7 channel uses

[Figure: two-state Markov chain; histogram of burst lengths under these parameters]
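A minimal simulator of the Gilbert-Elliott channel described above (helper name and the choice of when to sample transitions are mine, not from the talk):

```python
import random

def gilbert_elliott(n, alpha, beta, eps=0.0, seed=0):
    """Simulate n steps of a Gilbert-Elliott channel.  alpha is the
    Good->Bad transition probability, beta the Bad->Good probability;
    packets are lost w.p. eps in the Good state and w.p. 1 in the Bad
    state.  Returns a list of booleans (True = packet lost)."""
    rng = random.Random(seed)
    bad = False  # start in the Good state
    losses = []
    for _ in range(n):
        losses.append(bad or rng.random() < eps)
        bad = (rng.random() < alpha) if not bad else not (rng.random() < beta)
    return losses
```

Feeding these loss patterns to the candidate codes and counting unrecovered packets reproduces the kind of comparison shown on the next slide.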
Simulation Results
Gilbert-Elliott channel: (α, β) = (5x10^-4, 0.5), T = 12, R ≈ 0.5

[Figure: packet-loss probability vs. ε for the three codes below]

Code          | N | B      Code   | N | B
Strongly-MDS  | 6 | 6      MiDAS  | 2 | 9
Burst-Erasure | 1 | 11
Simulation Results II
Fritchman channel (9 states): (α, β) = (10^-5, 0.5), T = 40, R = 40/79

[Figure: Fritchman chain with one Good state and error states E_1, ..., E_N; histogram of burst lengths, analytical vs. simulated]
Simulation Results II (cont'd)
Fritchman channel (9 states): (α, β) = (10^-5, 0.5), T = 40

[Figure: loss probability vs. ε for Uncoded, RLC, SCo, and E-RLC (shifts 32 and 36)]

Code          | N  | B      Code     | N | B
Strongly-MDS  | 20 | 20     MiDAS-I  | 8 | 31
Burst-Erasure | 1  | 39     MiDAS-II | 4 | 35
Multicast Streaming Codes - Burst Erasure Broadcast Channel

[Figure: source stream into an encoder producing X[i]; decoder 1 sees bursts of length B_1 and decodes with delay T_1; decoder 2 sees bursts of length B_2 and decodes with delay T_2]

Motivation (B_1 < B_2):
- Receiver 1: good channel state
- Receiver 2: weaker channel state
- Delay adapts to the channel state

Capacity function C(T_1, T_2, B_1, B_2)
- Single-user upper bound: C ≤ min( T_1/(T_1+B_1), T_2/(T_2+B_2) )
- Concatenation lower bound: C ≥ 1 / (1 + B_1/T_1 + B_2/T_2)
Multicast Streaming Capacity
Badr-Khisti-Lui (IEEE Trans. Inform. Theory, to appear 2014)

Assume w.l.o.g. B_2 ≥ B_1.

[Figure: capacity as a function of the delays, with regimes delimited by T_2 = T_1 and T_2 = T_1 + B_1; single-user bounds T_1/(T_1+B_1) and T_2/(T_2+B_2) mark the extremes, with SCo and DE-SCo constructions achieving intermediate points]

Partial characterization of the capacity region.
Other Extensions
- Mismatched streaming codes (Patil-Badr-Khisti-Tan, Asilomar 2013)
- Partial-recovery streaming codes (Badr-Khisti-Tan-Apostolopoulos, JSTSP 2014)
- Multiple erasure bursts (Li-Khisti-Girod, Asilomar 2011): interleaved low-delay codes
- Multiple links (Lui-Badr-Khisti, CWIT 2011): layered coding for burst erasure channels
- Multiple source streams with different decoding delays (Lui, unpublished, 2011): embedded codes

Other Results
- Burst erasure channels: Martinian and Sundberg (IT 2004)
- Other recent results: Leong-Ho (ISIT 2012), Leong-Qureshi-Ho (ISIT 2013)
Outline

Real-Time Streaming Communication
- Streaming Error Correction (with Ahmed Badr, Toronto; Wai-Tian Tan and John Apostolopoulos, Cisco)
- Streaming Compression (with Farrokh Etezadi, Toronto; Mitchell Trott)
Compression vs. Error Propagation

[Figure: GOP picture structure¹]

- Compression: predictive coding vs. still-image coding
- Error propagation: interleaving approach, error control coding

¹ Source: http://www.networkwebcams.com
Motivation - Video Streaming

A fundamental tradeoff (e.g., in MPEG):
- Compression efficiency: predictive coding
- Robustness to error propagation: still-image compression
Information-Theoretic Model

[Figure: source packets s_1, ..., s_8 compressed into f_1, ..., f_8; the channel erases f_4 (B = 1); s_4, s_5, s_6 are not recovered (W = 2) and the reproduced stream resumes at s_7]

System parameters:
- Compression rate R
- Erasure burst length B
- Recovery window W
- Lossless or lossy recovery
Problem Setup

Source Model: sequence of vectors, temporally Markov and spatially i.i.d. (see Viswanathan and Berger '00, Wang and Xu '10, Song-Chen-Wang-Liu '13, Ma-Ishwar '11):
  Pr(s^n_i | s^n_{i-1}, s^n_{i-2}, ...) = Π_{j=1}^{n} Pr(s_{i,j} | s_{i-1,j})

Channel Model: burst erasure model
  g_i = f_i if i ∉ {j, j+1, ..., j+B-1}; g_i = ★ otherwise

Encoder: F_i : (s^n_0, ..., s^n_i) → f_i ∈ {1, 2, ..., 2^{nR}}
Decoder: G_i : (g_0, ..., g_i) → ŝ^n_i

Lossless recovery: Pr(s^n_i ≠ ŝ^n_i) ≤ ε_n, except for i ∈ [j, ..., j + B + W - 1].
Rate-Recovery Function

Definition (Rate-Recovery Function). The minimum compression rate R that can be achieved when:
- erasure-burst length = B
- recovery window = W
- lossless or lossy recovery

[Figure: timeline of packets f_0, f_1, ...; packets f_j, ..., f_{j+B-1} are erased; sources s_j, ..., s_{j+B+W-1} need not be recovered (error-propagation window); recovery resumes at s_{j+B+W}]
Rate-Recovery Function
Etezadi-Khisti-Trott (IEEE Trans. Information Theory, Aug. 2014)

[Figure: same timeline; a burst of B erasures followed by an error-propagation window of W packets]

Theorem (Upper and Lower Bounds - Lossless Case)
  R⁺(B, W) = H(s_1 | s_0) + (1/(W+1)) I(s_B; s_{B-1} | s_{-1})
  R⁻(B, W) = H(s_1 | s_0) + (1/(W+1)) I(s_{B+W}; s_{B-1} | s_{-1})

- Upper bound: binning-based scheme.
- Upper and lower bounds coincide for W = 0 and W → ∞.
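As a worked example (mine, not from the talk), the two bounds can be evaluated in closed form for a binary symmetric Markov source with flip probability p, using the k-step flip probability q_k = (1 - (1-2p)^k)/2 and the identity I(s_m; s_{B-1} | s_{-1}) = H(s_m | s_{-1}) - H(s_m | s_{B-1}):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_recovery_bounds(p, B, W):
    """Evaluate the lossless rate-recovery bounds of the theorem for a
    binary symmetric Markov source with flip probability p (hypothetical
    worked example).  Returns (lower, upper) in bits/symbol."""
    q = lambda k: (1 - (1 - 2 * p) ** k) / 2       # k-step flip probability
    # I(s_m; s_{B-1} | s_{-1}): s_m is m+1 steps after s_{-1}
    # and m-(B-1) steps after s_{B-1}
    I = lambda m: h2(q(m + 1)) - h2(q(m - (B - 1)))
    upper = h2(p) + I(B) / (W + 1)
    lower = h2(p) + I(B + W) / (W + 1)
    return lower, upper
```

Consistent with the theorem, the two bounds coincide at W = 0, and both approach H(s_1 | s_0) = h2(p) as W grows.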
Lower Bound - Rate-Recovery Function
Etezadi-Khisti-Trott (IEEE Trans. Information Theory, Aug. 2014)

Let W = 1, and consider the encoding of s^n_j and s^n_{j+1}.

[Figure: genie-aided argument; decoder j is given the past up to s^n_{j-1} together with s^n_{j+1} and reproduces ŝ^n_j from (f_j, f_{j+1}); decoder j+1, following a burst, is given only the past up to s^n_{j-B-1} and reproduces ŝ^n_{j+1} from f_{j+1}]

Lower bound:
  R_j + R_{j+1} ≥ H(s_j | s_{j-1}, s_{j+1}) + H(s_{j+1} | s_{j-B-1}).
Gauss-Markov Source
Etezadi-Khisti-Trott (IEEE Trans. Information Theory, Aug. 2014)

Stationary Gauss-Markov source, s_i ~ N(0, 1):
  s_i = ρ s_{i-1} + z_i,  z_i ~ N(0, 1 - ρ²), independent of {s_j}_{j<i}

Quadratic distortion measure:
  (1/n) E[ Σ_{k=1}^{n} (s_i(k) - ŝ_i(k))² ] ≤ D

No recovery period (W = 0). Lossy rate-recovery function R(B, D), in the high-resolution limit:
  R(B, D) = (1/2) log( (1 - ρ^{2(B+1)}) / D ) + o_D(1)

- Upper and lower bounds.
- W > 0: novel hybrid coding schemes.
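A small sketch of the source model and the high-resolution formula above (helper names are hypothetical; the o_D(1) term is dropped):

```python
import math
import random

def gauss_markov(n, rho, seed=0):
    """Generate n samples of the stationary Gauss-Markov source
    s_i = rho*s_{i-1} + z_i with z_i ~ N(0, 1-rho^2), so s_i ~ N(0, 1)."""
    rng = random.Random(seed)
    s = [rng.gauss(0.0, 1.0)]                # stationary initialization
    sigma_z = math.sqrt(1 - rho ** 2)
    for _ in range(n - 1):
        s.append(rho * s[-1] + rng.gauss(0.0, sigma_z))
    return s

def high_res_rate(B, D, rho):
    """High-resolution approximation of the lossy rate-recovery function
    for W = 0, in bits per source symbol."""
    return 0.5 * math.log2((1 - rho ** (2 * (B + 1))) / D)
```

As expected, the required rate grows with the burst length B: a longer burst leaves more innovation to resynchronize after the erasure.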
Gauss-Markov Sources, W > 0 (In Progress)

Candidate test channels:
- Predictive coding: u_i = s_i - ŝ_i(u^{i-1}) + n_i
- Memoryless quantize-and-bin: u_i = s_i + n_i
- Truncated prediction (hybrid): u_i = s_i - ŝ_i(u^{i-1}_{i-W}) + n_i

[Figure: timeline with erased packets f_j, ..., f_{j+B-1} and the error-propagation window ending at s_{j+B+W}]
Block-Fading Channels

[Figure: source packets w_1, w_2, ..., one per coherence block with gains h_1, h_2, ...; each packet decoded with delay T blocks]

Block-fading channel: y_i = h_i x_i + z_i, i = 1, 2, ...
- n symbols per coherence block
- Source packet: one per coherence block, nR bits
- Decoding delay: T coherence blocks
- Quasi-static model: T = 1; ergodic model: T → ∞
Streaming DMT of Fading Channels

Theorem (Khisti-Draper, IEEE Trans. Inform. Theory, to appear 2014). The diversity-multiplexing tradeoff for a streaming source with a delay of T coherence blocks over the block-fading channel model is
  d(r) = T · d_1(r),
where d_1(r) is the quasi-static DMT.

- Coding scheme: time sharing
- Upper bound: outage-amplification argument
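For the scalar (SISO) Rayleigh case, the quasi-static tradeoff is d_1(r) = 1 - r, so the theorem gives a simple closed form (a minimal sketch, assuming the SISO case):

```python
def streaming_dmt(r, T):
    """Streaming diversity-multiplexing tradeoff d(r) = T * d1(r) for a
    scalar Rayleigh block-fading channel, where the quasi-static
    tradeoff is d1(r) = 1 - r for 0 <= r <= 1."""
    assert 0 <= r <= 1
    return T * (1 - r)
```

With T = 2 this reproduces the d(r) = 2 - 2r line stated on the achievability slide that follows.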
Achievability: Fading Channels (T = 2)

- Divide each coherence block into T sub-blocks
- Apply a parallel-channel code across the T sub-blocks

[Figure: packets w_1, ..., w_6 carried by codewords X_1, ..., X_6 over fading gains h_1, ..., h_6; each packet w_i is decoded from the channel outputs after T = 2 coherence blocks]

DMT: d(r) = 2 - 2r
Physical Layer Security
Secret-Key Generation over Fading Channels
Khisti, IEEE Trans. Inform. Theory, submitted Aug. 2013

[Figure: terminals A and B exchange x_A, x_B over reciprocal forward/reverse links, y_B = h_AB x_A + n_AB and y_A = h_BA x_B + n_BA; an eavesdropper E observes z_AE = g_AE x_A + n_AE and z_BE = g_BE x_B + n_BE; A and B distill keys K_A, K_B]

- Two-way block-fading channel, coherence period T
- Channel reciprocity
- Non-coherent main channel; perfect CSI for the eavesdropper
- Interactive communication

High-SNR capacity:
  C ≈ (1/T) [ I(h_AB; h_BA) + E[log(1 + |h_AB|² / |g_A|²)] + E[log(1 + |h_BA|² / |g_B|²)] ]

Optimal scheme: training + interactive communication
Secure Communication - Wiretap Channel
- Secure multi-antenna multicast (A. Khisti, IT 2011; A. Khisti and D. Zhang, Comm. Letters 2013)
- Practical codes for the MIMO wiretap channel (A. Khina and Y. Kochman, ISIT 2014): V-BLAST type decomposition; scalar AWGN wiretap codebooks
- MIMO arbitrarily varying wiretap channel (A. Yener and X. He, IT Trans. 2013, T-COMM 2014)
- Fading wiretap channel with imperfect CSI (Z. Rezki and M. S. Alouini, T-COMM 2014, T-Wireless 2014)
- Private broadcasting (T. Liu, IT Trans. 2014)
- Secure broadcasting with independent secret keys (R. Schaefer, CISS 2014)
Conclusions

Real-Time Streaming Communication

Part I: Channel Coding
- Channels with burst and isolated erasures
- Explicit codes; new distance and span metrics

Part II: Source Coding
- Tradeoff between compression rate and error propagation
- Lossless recovery; Gaussian sources with quadratic distortion