Efficiently decodable codes for the binary deletion channel
1 Efficiently decodable codes for the binary deletion channel. Venkatesan Guruswami, Ray Li* (rayyli@stanford.edu), Carnegie Mellon University. August 18, 2017. V. Guruswami and R. Li (CMU), Efficiently decodable codes for the BDC, August 18, 2017.
2 Outline for section 1: 1 Introduction; 2 Binary deletion channel background; 3 Construction ingredients; 4 Construction; 5 Open questions.
3 Deletion channel. Alice sends m = 01 through the channel to Bob. The channel deletes bits: Bob may receive just 1, and cannot tell whether m = 11, m = 10, or m = 01. Error types (before/after): a substitution turns a bit into another bit, an erasure turns a bit into ?, and a deletion removes the bit entirely, along with its position.
9 Error correcting codes. Alice encodes m = 01 (Enc), the codeword passes through the channel, and Bob decodes (Dec) to recover m = 01. There is a tradeoff between the redundancy and the robustness of the code; we also want efficient construction, encoding, and decoding (the focus of this work). Note: how redundancy is placed matters for deletions; the repetition code m -> mmm fails for 1 deletion.
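The note above can be checked directly. The following sketch (my own illustration, not from the talk) finds a string obtainable by one deletion from both (01)^3 = 010101 and (10)^3 = 101010, so 3-fold repetition cannot disambiguate after a single deletion.

```python
# Illustration (not from the talk): naive 3-fold repetition m -> mmm
# fails under a single deletion, because two different repeated messages
# can share a common one-deletion result.
def deletions_by_one(s):
    """All strings obtainable from s by deleting exactly one character."""
    return {s[:i] + s[i + 1:] for i in range(len(s))}

def repeat3(m):
    """The repetition encoder m -> mmm."""
    return m * 3

common = deletions_by_one(repeat3("01")) & deletions_by_one(repeat3("10"))
print(sorted(common))  # -> ['01010', '10101']: ambiguous received strings
```

Receiving 01010 leaves Bob unable to decide between m = 01 and m = 10.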
16 Error correcting codes: notation. Alphabet: Σ (e.g. {0, 1}). (Block) lengths: n, m, N. Codeword: c ∈ Σ^n. Rate: R = log(#messages)/N ∈ (0, 1); R is the proportion of non-redundant symbols. We want families of codes (implicitly N → ∞).
18 Binary deletion channel. Adversarial model: the number of deletions is fixed (pn) and decoding must always succeed. Random model (this work): i.i.d. deletions, decoding succeeds w.h.p. Definition: for p ∈ (0, 1), the binary deletion channel with deletion probability p (BDC_p) deletes each bit of a binary string independently with probability p.
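The channel in the definition above is easy to simulate; this is a minimal sketch with my own helper names, not code from the talk.

```python
# Minimal BDC_p simulator: each bit is deleted independently with
# probability p; surviving bits keep their order but not their positions.
import random

def bdc(x, p, rng=random):
    """Pass the bit string x through BDC_p using the given RNG."""
    return "".join(bit for bit in x if rng.random() >= p)

rng = random.Random(0)
sent = "01" * 500
received = bdc(sent, 0.9, rng)
print(len(received))  # around (1 - p) * len(sent) = 100 on average
```

Note the receiver sees only a subsequence: unlike erasures, there is no marker showing where bits were removed.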
20 Outline for section 2: 1 Introduction; 2 Binary deletion channel background; 3 Construction ingredients; 4 Construction; 5 Open questions.
21 Capacity. Definition: C_BDC(p) = sup{R : there exists a rate-R code family reliable against BDC_p}. Question: what is the capacity of the binary deletion channel with deletion probability p? Binary symmetric channel (BSC): well understood. Binary erasure channel (BEC): well understood. Binary deletion channel (BDC): capacity unknown. Aside: for adversarial deletions, it is unknown whether the capacity is 0 for p ∈ [√2 − 1, 1/2).
24 Existing bounds on BDC_p capacity. Recall that for p ∈ [0, 1], H(p) = p log(1/p) + (1 − p) log(1/(1 − p)). Lower bounds: 1 − 2H(p) (adversarial bound); 1 − H(p) [Gallager 61, Zigangirov 69]; (1 − p)/9 [Drinea, Mitzenmacher 06], but this result is non-constructive. Upper bounds: 1 − H(p) as p → 0 [Kalai, Mitzenmacher, Sudan 10]; 1 − p (the BEC capacity); 0.4143(1 − p) for p ≥ 0.65 [Rahmati, Duman 15]; numerical bounds [Fertonani, Duman 10]. So the capacity is understood as p → 0 (it behaves as 1 − H(p)) and as p → 1 (it behaves as α(1 − p) for some constant α).
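The bounds above are easy to evaluate numerically. This sanity-check sketch (mine, not from the talk) prints the best quoted bounds at a few values of p; the 1 − H(p) lower bound is only applied for p ≤ 1/2, where it is meaningful.

```python
# Evaluate the lower/upper bounds on C_BDC(p) quoted above.
from math import log2

def H(p):
    """Binary entropy (bits)."""
    if p in (0.0, 1.0):
        return 0.0
    return p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))

def bounds(p):
    lower = (1 - p) / 9                       # Drinea-Mitzenmacher 06
    if p <= 0.5:
        lower = max(lower, 1 - H(p))          # Gallager 61 / Zigangirov 69
    upper = 1 - p                             # BEC capacity
    if p >= 0.65:
        upper = min(upper, 0.4143 * (1 - p))  # Rahmati-Duman 15
    return lower, upper

for p in (0.1, 0.5, 0.9):
    lo, hi = bounds(p)
    print(f"p={p}: {lo:.4f} <= C_BDC(p) <= {hi:.4f}")
```

At p = 0.9 this already shows the gap the talk targets: roughly 0.011 versus 0.041, both of order 1 − p.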
27 [Plot of the bounds above: lower bounds 1 − H(p) [Gallager 61, Zigangirov 69] and (1 − p)/9 [Drinea, Mitzenmacher 06]; upper bounds 0.4143(1 − p) for p ≥ 0.65 [Rahmati, Duman 15] and numerical bounds [Fertonani, Duman 10]. Plot made with Mathematica.]
28 New algorithmic result. Theorem (Guruswami, Li 17): let p ∈ (0, 1). There is a constant α > 0 (α = 1/110) and an explicit family of binary codes that has rate α(1 − p) and is constructible, encodable, and decodable in polynomial time on BDC_p. Throughout the talk, think of p close to 1.
31 Outline for section 3: 1 Introduction; 2 Binary deletion channel background; 3 Construction ingredients (concatenated codes; the Haeupler-Shahrasbi code); 4 Construction; 5 Open questions.
32 Ingredient 1: concatenated codes. Example: Enc_out(m) = aabc ∈ C_out; Enc_in: a → 0011, b → 1100, c → 0101; so Enc(m) = 0011 0011 1100 0101 ∈ C.
33 Ingredient 1: concatenated codes. C_out: alphabet size K, length n. C_in: alphabet size k, length m. C: alphabet size k, length N = mn. The message is encoded into c = σ_1 σ_2 ... σ_n ∈ C_out, and each σ_i is then encoded into a codeword of C_in, giving C_in(σ_1) C_in(σ_2) ... C_in(σ_n) ∈ C. The overall rate is R = R_out · R_in.
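The example above can be written out as a tiny encoder; the inner map is the slide's example, and the function names are mine.

```python
# Toy concatenated encoder: the inner code maps each outer symbol to a
# 4-bit word (a -> 0011, b -> 1100, c -> 0101, as in the example).
INNER = {"a": "0011", "b": "1100", "c": "0101"}

def concat_encode(outer_codeword):
    """Replace each outer symbol by its inner codeword and join."""
    return "".join(INNER[sym] for sym in outer_codeword)

print(concat_encode("aabc"))  # -> 0011001111000101 (length N = mn = 16)
```

Here n = 4 outer symbols and m = 4 inner bits give a binary codeword of length N = 16, matching the N = mn formula.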
34 Ingredient 2: what do we want in an outer code? Rate close to 1; tolerance of a constant fraction of adversarial insertions and deletions; fast construction, encoding, and decoding.
35 Ingredient 2: the Haeupler-Shahrasbi code (STOC 17). Alphabet size |Σ| = poly(1/ε), rate 1 − ε, corrects 0.001n adversarial insertions and deletions. Construction time: poly(n). Encoding time: O(n). Decoding time: O(n²).
36 Outline for section 4: 1 Introduction; 2 Binary deletion channel background; 3 Construction ingredients; 4 Construction (first attempts; our construction; remarks on the construction); 5 Open questions.
37 Reminder: new algorithmic result. Theorem (Guruswami, Li 17): let p ∈ (0, 1). There is a constant α > 0 and an explicit family of binary codes that has rate α(1 − p) and is constructible, encodable, and decodable in polynomial time on BDC_p.
38 Attempt 1: vanilla concatenation. C_out is the Haeupler-Shahrasbi code; C_in ⊆ {0, 1}^m is robust against BDC_p (m = poly(1/ε)). Rate: 0.999 R_in. The outer code decodes in O(n²) time, and the inner code decodes in constant time per codeword. Issue: where do the inner codewords start? Deletions destroy the boundaries between inner codewords.
43 Attempt 2: concatenation with buffers. C_out is the Haeupler-Shahrasbi code; C_in ⊆ {0, 1}^m is robust against BDC_p (m = poly(1/ε)); a buffer of m 0s separates adjacent inner codewords. Decoding: identify decoding buffers as long runs of 0s, then inner-decode the decoding windows between the decoding buffers. Issue: the bits of the inner codewords are not deleted according to BDC_p once we condition on the buffers being identified, so we cannot use the DM06 rate-(1 − p)/9 code as a black-box inner code.
45 Concatenation with buffers and duplication. Outer code: the Haeupler-Shahrasbi code. Inner code: codewords have runs of length 1 and 2 only, start and end with 1, are chosen greedily, and correct a δ fraction of adversarial deletions (δ ≈ 0.008). Buffer: a run of 0s between adjacent inner codewords. Duplication: duplicate each bit B(p) = 60/(1 − p) times, so a run of length 1 becomes 1^B or 0^B and a run of length 2 becomes 1^{2B} or 0^{2B}. Then E[|BDC_p(1^B)|] = 60 and E[|BDC_p(1^{2B})|] = 120, so the idea is to decode a received run 1^a as 1 for a ≤ 86 and as 11 for a > 86. Rate: α(1 − p).
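The encoding map just described can be sketched as follows. This is my own sketch: the parameters are tiny for display, whereas the construction uses B = 60/(1 − p) and buffers whose length scales with the inner block.

```python
# Encoding sketch: duplicate each bit of every inner codeword B times,
# then join the results with buffers of 0s. Illustrative parameters only;
# the construction takes B = 60/(1 - p).
def duplicate(word, B):
    """Repeat each bit of word B times (turning run length r into rB)."""
    return "".join(bit * B for bit in word)

def encode(inner_codewords, B, buffer_len):
    """Join duplicated inner codewords with all-zero buffers."""
    buf = "0" * buffer_len
    return buf.join(duplicate(w, B) for w in inner_codewords)

print(encode(["101", "11"], B=3, buffer_len=4))
# -> 1110001110000111111
```

Because inner codewords start and end with 1, the only long runs of 0s in the codeword are the buffers, which is what the decoder relies on.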
49 Decoding algorithm. 1. Runs of at least 0.03m zeros (decoding buffers) divide the received word into decoding windows: O(n). 2. Deduplicate the runs: decode b^a as b for a ≤ 86 and as bb for a > 86: O(n). 3. For each decoding window, recover the outer symbol σ using Dec_in: O(n). 4. Use Dec_out to decode σ_1 σ_2 ... σ_n into the message m: O(n²). Total runtime: O(n²).
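The first two steps above amount to run-length processing; this sketch is mine, with small illustrative thresholds standing in for the talk's 0.03m and 86.

```python
# Decoding sketch: split the received word on long runs of 0s (the
# decoding buffers), then deduplicate the runs inside each window.
import re
from itertools import groupby

def split_windows(received, buffer_thresh):
    """Runs of >= buffer_thresh zeros divide the word into windows."""
    return [w for w in re.split("0{%d,}" % buffer_thresh, received) if w]

def deduplicate(window, run_thresh):
    """Decode each run b^a as b if a <= run_thresh, else as bb."""
    return "".join(bit * (1 if sum(1 for _ in run) <= run_thresh else 2)
                   for bit, run in groupby(window))

received = "11100011100000001111110000000111"
windows = split_windows(received, buffer_thresh=6)
print([deduplicate(w, run_thresh=4) for w in windows])  # -> ['101', '11', '1']
```

Each deduplicated window is then handed to Dec_in, and the resulting symbol sequence to Dec_out, as in steps 3 and 4.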
58 What could go wrong? (a) Deleted buffer: two adjacent windows merge into one, costing 2 deletions and 1 insertion at the outer level. (b) Created buffer: a long run of 0s appears inside a window (e.g. x 0^{0.06m} y), splitting it in two, costing 2 insertions and 1 deletion. (c) Corrupted decoding window: the inner decoder outputs a wrong symbol, costing 1 insertion and 1 deletion. W.h.p., the total number of insertions/deletions in σ_1 ... σ_n is < 0.001n, which the outer code tolerates.
62 Remark: rate improvement. The rate is (1 − p)/110; it can be improved to (1 − p)/60 if the duplication count is Poisson rather than deterministic. There is no easy way to use the rate-(1 − p)/9 code directly as a black-box inner code (which would give rate (1 − p)/9).
63 Remark: alternative outer codes. Reed-Solomon codes (encoding pairs (i, α_i) into the inner code): similar rate, worse runtime, and the inner code must have length Ω(log n). High-rate binary codes efficiently decodable against insertions and deletions [Guruswami, Li 16]: worse rate and runtime. Comparison (outer code / error type / inner length / rate / error fraction / decoding time): HS 17: ins/del, c, 1 − ε, Ω(ε), O(N²). GL 16: ins/del, c, 1 − ε, Ω(ε⁵), poly(N). RS: erase/sub, Ω(log n), 1 − ε, Ω(ε), O(N³).
64 Outline for section 5: 1 Introduction; 2 Binary deletion channel background; 3 Construction ingredients; 4 Construction; 5 Open questions.
65 Open questions. The capacity of the binary deletion channel. Efficiently decodable codes for the BDC with rate α(1 − p) for larger α, perhaps α ≥ 1/9. Efficiently decodable codes for the BDC with rate 1 − O(H(p)) as p → 0. The capacity of random channels applying both insertions and deletions.
66 Thank you! Research supported in part by NSF grant CCF.
72 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 1, JANUARY 2008 Capacity Bounds for Sticky Channels Michael Mitzenmacher, Member, IEEE Abstract The capacity of sticky channels, a subclass of insertion
More informationMaximal Noise in Interactive Communication over Erasure Channels and Channels with Feedback
Maximal Noise in Interactive Communication over Erasure Channels and Channels with Feedback Klim Efremenko UC Berkeley klimefrem@gmail.com Ran Gelles Princeton University rgelles@cs.princeton.edu Bernhard
More informationNoisy channel communication
Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 6 Communication channels and Information Some notes on the noisy channel setup: Iain Murray, 2012 School of Informatics, University
More informationNon-Malleable Coding Against Bit-wise and Split-State Tampering
Non-Malleable Coding Against Bit-wise and Split-State Tampering MAHDI CHERAGHCHI CSAIL MIT Cambridge, MA 02139 VENKATESAN GURUSWAMI Computer Science Department Carnegie Mellon University Pittsburgh, PA
More informationLecture 12. Block Diagram
Lecture 12 Goals Be able to encode using a linear block code Be able to decode a linear block code received over a binary symmetric channel or an additive white Gaussian channel XII-1 Block Diagram Data
More informationECEN 655: Advanced Channel Coding
ECEN 655: Advanced Channel Coding Course Introduction Henry D. Pfister Department of Electrical and Computer Engineering Texas A&M University ECEN 655: Advanced Channel Coding 1 / 19 Outline 1 History
More informationCapacity Upper Bounds for the Deletion Channel
Capacity Upper Bounds for the Deletion Channel Suhas Diggavi, Michael Mitzenmacher, and Henry D. Pfister School of Computer and Communication Sciences, EPFL, Lausanne, Switzerland Email: suhas.diggavi@epfl.ch
More informationCS151 Complexity Theory. Lecture 9 May 1, 2017
CS151 Complexity Theory Lecture 9 Hardness vs. randomness We have shown: If one-way permutations exist then BPP δ>0 TIME(2 nδ ) ( EXP simulation is better than brute force, but just barely stronger assumptions
More informationRelaxed Locally Correctable Codes in Computationally Bounded Channels
Relaxed Locally Correctable Codes in Computationally Bounded Channels Elena Grigorescu (Purdue) Joint with Jeremiah Blocki (Purdue), Venkata Gandikota (JHU), Samson Zhou (Purdue) Classical Locally Decodable/Correctable
More informationLDPC Codes. Slides originally from I. Land p.1
Slides originally from I. Land p.1 LDPC Codes Definition of LDPC Codes Factor Graphs to use in decoding Decoding for binary erasure channels EXIT charts Soft-Output Decoding Turbo principle applied to
More informationLast time, we described a pseudorandom generator that stretched its truly random input by one. If f is ( 1 2
CMPT 881: Pseudorandomness Prof. Valentine Kabanets Lecture 20: N W Pseudorandom Generator November 25, 2004 Scribe: Ladan A. Mahabadi 1 Introduction In this last lecture of the course, we ll discuss the
More informationLECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem
LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser
More informationLecture 21: P vs BPP 2
Advanced Complexity Theory Spring 206 Prof. Dana Moshkovitz Lecture 2: P vs BPP 2 Overview In the previous lecture, we began our discussion of pseudorandomness. We presented the Blum- Micali definition
More informationLecture 9 Polar Coding
Lecture 9 Polar Coding I-Hsiang ang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 29, 2015 1 / 25 I-Hsiang ang IT Lecture 9 In Pursuit of Shannon s Limit Since
More informationNotes for the Hong Kong Lectures on Algorithmic Coding Theory. Luca Trevisan. January 7, 2007
Notes for the Hong Kong Lectures on Algorithmic Coding Theory Luca Trevisan January 7, 2007 These notes are excerpted from the paper Some Applications of Coding Theory in Computational Complexity [Tre04].
More informationLocality in Coding Theory
Locality in Coding Theory Madhu Sudan Harvard April 9, 2016 Skoltech: Locality in Coding Theory 1 Error-Correcting Codes (Linear) Code CC FF qq nn. FF qq : Finite field with qq elements. nn block length
More informationDecoding Reed-Muller codes over product sets
Rutgers University May 30, 2016 Overview Error-correcting codes 1 Error-correcting codes Motivation 2 Reed-Solomon codes Reed-Muller codes 3 Error-correcting codes Motivation Goal: Send a message Don t
More informationLecture 5, CPA Secure Encryption from PRFs
CS 4501-6501 Topics in Cryptography 16 Feb 2018 Lecture 5, CPA Secure Encryption from PRFs Lecturer: Mohammad Mahmoody Scribe: J. Fu, D. Anderson, W. Chao, and Y. Yu 1 Review Ralling: CPA Security and
More informationTHIS work is motivated by the goal of finding the capacity
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 8, AUGUST 2007 2693 Improved Lower Bounds for the Capacity of i.i.d. Deletion Duplication Channels Eleni Drinea, Member, IEEE, Michael Mitzenmacher,
More informationConcatenated Polar Codes
Concatenated Polar Codes Mayank Bakshi 1, Sidharth Jaggi 2, Michelle Effros 1 1 California Institute of Technology 2 Chinese University of Hong Kong arxiv:1001.2545v1 [cs.it] 14 Jan 2010 Abstract Polar
More informationLocally Decodable Codes
Foundations and Trends R in sample Vol. xx, No xx (xxxx) 1 114 c xxxx xxxxxxxxx DOI: xxxxxx Locally Decodable Codes Sergey Yekhanin 1 1 Microsoft Research Silicon Valley, 1065 La Avenida, Mountain View,
More informationDiscrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Discussion 6A Solution
CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Discussion 6A Solution 1. Polynomial intersections Find (and prove) an upper-bound on the number of times two distinct degree
More informationAn Introduction to Algorithmic Coding Theory
An Introduction to Algorithmic Coding Theory M. Amin Shokrollahi Bell Laboratories Part : Codes - A puzzle What do the following problems have in common? 2 Problem : Information Transmission MESSAGE G
More informationNew constructions of WOM codes using the Wozencraft ensemble
New constructions of WOM codes using the Wozencraft ensemble Amir Shpilka Abstract In this paper we give several new constructions of WOM codes. The novelty in our constructions is the use of the so called
More informationExplicit Capacity-Achieving List-Decodable Codes or Decoding Folded Reed-Solomon Codes up to their Distance
Explicit Capacity-Achieving List-Decodable Codes or Decoding Folded Reed-Solomon Codes up to their Distance Venkatesan Guruswami Atri Rudra Department of Computer Science and Engineering University of
More informationOptimal Error Rates for Interactive Coding II: Efficiency and List Decoding
Optimal Error Rates for Interactive Coding II: Efficiency and List Decoding Mohsen Ghaffari MIT ghaffari@mit.edu Bernhard Haeupler Microsoft Research haeupler@cs.cmu.edu Abstract We study coding schemes
More informationDecoding of Interleaved Reed Solomon Codes over Noisy Data
Decoding of Interleaved Reed Solomon Codes over Noisy Data Daniel Bleichenbacher 1, Aggelos Kiayias 2, and Moti Yung 3 1 Bell Laboratories, Murray Hill, NJ, USA bleichen@bell-labscom 2 Department of Computer
More informationLinear time list recovery via expander codes
Linear time list recovery via expander codes Brett Hemenway and Mary Wootters June 7 26 Outline Introduction List recovery Expander codes List recovery of expander codes Conclusion Our Results One slide
More informationLinear-algebraic list decoding for variants of Reed-Solomon codes
Electronic Colloquium on Computational Complexity, Report No. 73 (2012) Linear-algebraic list decoding for variants of Reed-Solomon codes VENKATESAN GURUSWAMI CAROL WANG Computer Science Department Carnegie
More informationarxiv: v1 [cs.it] 22 Jul 2015
Efficient Low-Redundancy Codes for Correcting Multiple Deletions Joshua Brakensiek Venkatesan Guruswami Samuel Zbarsky arxiv:1507.06175v1 [cs.it] 22 Jul 2015 Carnegie Mellon University Abstract We consider
More informationEE 229B ERROR CONTROL CODING Spring 2005
EE 229B ERROR CONTROL CODING Spring 2005 Solutions for Homework 1 1. Is there room? Prove or disprove : There is a (12,7) binary linear code with d min = 5. If there were a (12,7) binary linear code with
More informationPhysical Layer and Coding
Physical Layer and Coding Muriel Médard Professor EECS Overview A variety of physical media: copper, free space, optical fiber Unified way of addressing signals at the input and the output of these media:
More informationCSCI-B609: A Theorist s Toolkit, Fall 2016 Oct 4. Theorem 1. A non-zero, univariate polynomial with degree d has at most d roots.
CSCI-B609: A Theorist s Toolkit, Fall 2016 Oct 4 Lecture 14: Schwartz-Zippel Lemma and Intro to Coding Theory Lecturer: Yuan Zhou Scribe: Haoyu Zhang 1 Roots of polynomials Theorem 1. A non-zero, univariate
More informationBlock Codes :Algorithms in the Real World
Block Codes 5-853:Algorithms in the Real World Error Correcting Codes II Reed-Solomon Codes Concatenated Codes Overview of some topics in coding Low Density Parity Check Codes (aka Expander Codes) -Network
More information1 Alphabets and Languages
1 Alphabets and Languages Look at handout 1 (inference rules for sets) and use the rules on some examples like {a} {{a}} {a} {a, b}, {a} {{a}}, {a} {{a}}, {a} {a, b}, a {{a}}, a {a, b}, a {{a}}, a {a,
More informationLocally Decodable Codes
Foundations and Trends R in sample Vol. xx, No xx (xxxx) 1 98 c xxxx xxxxxxxxx DOI: xxxxxx Locally Decodable Codes Sergey Yekhanin 1 1 Microsoft Research Silicon Valley, 1065 La Avenida, Mountain View,
More informationNew Directions in Coding Theory: Capacity and Limitations
New Directions in Coding Theory: Capacity and Limitations Ameya A. Velingker CMU-CS-16-122 August 2016 School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Thesis Committee: Venkatesan
More informationNew constructions of WOM codes using the Wozencraft ensemble
New constructions of WOM codes using the Wozencraft ensemble Amir Shpilka Abstract In this paper we give several new constructions of WOM codes. The novelty in our constructions is the use of the so called
More informationLecture 18: Shanon s Channel Coding Theorem. Lecture 18: Shanon s Channel Coding Theorem
Channel Definition (Channel) A channel is defined by Λ = (X, Y, Π), where X is the set of input alphabets, Y is the set of output alphabets and Π is the transition probability of obtaining a symbol y Y
More informationBelief propagation decoding of quantum channels by passing quantum messages
Belief propagation decoding of quantum channels by passing quantum messages arxiv:67.4833 QIP 27 Joseph M. Renes lempelziv@flickr To do research in quantum information theory, pick a favorite text on classical
More informationRegenerating Codes and Locally Recoverable. Codes for Distributed Storage Systems
Regenerating Codes and Locally Recoverable 1 Codes for Distributed Storage Systems Yongjune Kim and Yaoqing Yang Abstract We survey the recent results on applying error control coding to distributed storage
More informationOn the NP-Hardness of Bounded Distance Decoding of Reed-Solomon Codes
On the NP-Hardness of Bounded Distance Decoding of Reed-Solomon Codes Venkata Gandikota Purdue University vgandiko@purdue.edu Badih Ghazi MIT badih@mit.edu Elena Grigorescu Purdue University elena-g@purdue.edu
More informationShannon s noisy-channel theorem
Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for
More informationNotes 10: Public-key cryptography
MTH6115 Cryptography Notes 10: Public-key cryptography In this section we look at two other schemes that have been proposed for publickey ciphers. The first is interesting because it was the earliest such
More informationA list-decodable code with local encoding and decoding
A list-decodable code with local encoding and decoding Marius Zimand Towson University Department of Computer and Information Sciences Baltimore, MD http://triton.towson.edu/ mzimand Abstract For arbitrary
More informationRaptor Codes: From a Math Idea to LTE embms. BIRS, October 2015
Raptor Codes: From a Math Idea to LTE embms BIRS, October 2015 The plan is to... 1 introduce LT codes and Raptor codes 2 provide insights into their design 3 address some common misconceptions 2 / 31 The
More informationNetwork Coding and Schubert Varieties over Finite Fields
Network Coding and Schubert Varieties over Finite Fields Anna-Lena Horlemann-Trautmann Algorithmics Laboratory, EPFL, Schweiz October 12th, 2016 University of Kentucky What is this talk about? 1 / 31 Overview
More informationTutorial: Locally decodable codes. UT Austin
Tutorial: Locally decodable codes Anna Gál UT Austin Locally decodable codes Error correcting codes with extra property: Recover (any) one message bit, by reading only a small number of codeword bits.
More informationLecture 7 September 24
EECS 11: Coding for Digital Communication and Beyond Fall 013 Lecture 7 September 4 Lecturer: Anant Sahai Scribe: Ankush Gupta 7.1 Overview This lecture introduces affine and linear codes. Orthogonal signalling
More informationExercises with solutions (Set B)
Exercises with solutions (Set B) 3. A fair coin is tossed an infinite number of times. Let Y n be a random variable, with n Z, that describes the outcome of the n-th coin toss. If the outcome of the n-th
More informationAn Introduction to Low Density Parity Check (LDPC) Codes
An Introduction to Low Density Parity Check (LDPC) Codes Jian Sun jian@csee.wvu.edu Wireless Communication Research Laboratory Lane Dept. of Comp. Sci. and Elec. Engr. West Virginia University June 3,
More informationError-correcting codes and applications
Error-correcting codes and applications November 20, 2017 Summary and notation Consider F q : a finite field (if q = 2, then F q are the binary numbers), V = V(F q,n): a vector space over F q of dimension
More informationChapter 7 Reed Solomon Codes and Binary Transmission
Chapter 7 Reed Solomon Codes and Binary Transmission 7.1 Introduction Reed Solomon codes named after Reed and Solomon [9] following their publication in 1960 have been used together with hard decision
More informationEE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes
EE229B - Final Project Capacity-Approaching Low-Density Parity-Check Codes Pierre Garrigues EECS department, UC Berkeley garrigue@eecs.berkeley.edu May 13, 2005 Abstract The class of low-density parity-check
More informationCapacity of a channel Shannon s second theorem. Information Theory 1/33
Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,
More information