Local correctability of expander codes


Local correctability of expander codes. Brett Hemenway, Rafail Ostrovsky, Mary Wootters. IAS, April 4, 2014.

The point(s) of this talk. Locally decodable codes are codes which admit sublinear-time decoding of small pieces of a message. Expander codes are a family of error correcting codes based on expander graphs. In this work, we show that (appropriately instantiated) expander codes are high-rate locally decodable codes. Only two families of codes were previously known in this regime [KSY '11, GKS '12]. Expander codes (and the corresponding decoding algorithm and analysis) are very different from the existing constructions!

Outline
1. Local correctability: definitions and notation; example: Reed-Muller codes; previous work and our contribution
2. Expander codes
3. Local correctability of expander codes: requirement for the inner code (smooth reconstruction); decoding algorithm; example instantiation (finite geometry codes)
4. Conclusion

Error correcting codes. Alice encodes a message x ∈ Σ^k as a codeword C(x) ∈ Σ^N and sends it over a noisy channel. Bob receives a corrupted codeword w ∈ Σ^N and wants to recover x.

Locally decodable codes. Same setup, but Bob only wants a single message symbol x_i, and he makes only q queries to the corrupted codeword w.

Locally correctable codes. Now Bob wants a codeword symbol C(x)_i, again making only q queries.

Locally correctable codes, sans stick figures. Definition: C is (q, δ, η)-locally correctable if for all i ∈ [N], for all x ∈ Σ^k, and for all w ∈ Σ^N with d(w, C(x)) ≤ δN, P{Bob correctly guesses C(x)_i} ≥ 1 − η. Bob reads only q positions of the corrupted word w.

Local correctability vs. local decodability. When C is linear, local correctability implies local decodability: take a systematic generator matrix G, so that C(x) = Gx contains the symbols of x directly, and locally correct those positions.

Before we get too far, some notation. For a code C : Σ^k → Σ^N: the message length is k; the block length is N; the rate is k/N; the locality is q, the number of queries Bob makes. Goal: large rate, small locality.

Example: Reed-Muller codes. f(x, y) ↦ (f(0,0), f(0,1), f(1,0), f(1,1)). Alice's message: a multivariate polynomial of total degree d, f ∈ F_q[z_1, …, z_m]. Codeword: the evaluations of f at all points of F_q^m: C(f) = {f(x̄)}_{x̄ ∈ F_q^m}.

Locally correcting Reed-Muller codes. The message is f ∈ F_q[z_1, …, z_m]; the codeword is {f(x̄)}_{x̄ ∈ F_q^m}, indexed by the points of F_q^m. We want to correct C(f)_z̄ = f(z̄). Choose a random line through z̄, and consider the restriction of f to that line: g(t) = f(z̄ + t v̄). This is a univariate polynomial of degree at most d, and g(0) = f(z̄). Query all of the points on the line.
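The line-restriction decoder can be sketched in a few lines of code. This is my own toy illustration, not code from the talk: it fixes a small prime field F_11 and a particular degree-3 bivariate polynomial (both arbitrary choices), restricts the codeword to a random line through the target point, and recovers f(z̄) = g(0) by Lagrange interpolation from d + 1 uncorrupted queries.

```python
import random

q = 11          # work in F_11, integers mod a prime (arbitrary choice)
d = 3           # total degree bound; d + 1 points determine g on a line

# A fixed bivariate polynomial f of total degree <= 3 (the "message").
def f(x, y):
    return (2 + 5 * x + x * y + 3 * y * y * x) % q

points = [(x, y) for x in range(q) for y in range(q)]
codeword = {p: f(*p) for p in points}      # evaluations of f on F_q^2

def lagrange_at_zero(samples):
    """Interpolate the unique polynomial of degree < len(samples) through
    the (t, v) pairs and evaluate it at t = 0, all mod q."""
    total = 0
    for ti, vi in samples:
        num, den = 1, 1
        for tj, _ in samples:
            if tj != ti:
                num = num * (-tj) % q
                den = den * (ti - tj) % q
        total = (total + vi * num * pow(den, q - 2, q)) % q
    return total

def locally_correct(z):
    """Recover f(z) by querying d + 1 points on a random line through z."""
    v = random.choice([p for p in points if p != (0, 0)])   # random direction
    ts = random.sample(range(1, q), d + 1)                  # t = 0 is z itself
    samples = [(t, codeword[((z[0] + t * v[0]) % q, (z[1] + t * v[1]) % q)])
               for t in ts]
    return lagrange_at_zero(samples)                        # g(0) = f(z)
```

With uncorrupted queries this always returns f(z̄); tolerating δN errors additionally needs noisy polynomial interpolation (e.g., Reed-Solomon decoding) on the line, which is omitted here.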

Resulting parameters. The rate is binom(m+d, m)/q^m (we need d = O(q) so that we can decode). The locality is q (the field size). If we choose m constant, we get: rate constant, but less than 1/2; locality N^{1/m} = N^ε.

Question: Reed-Muller codes have locality N^ε and constant rate, but the rate is less than 1/2. Are there locally decodable codes with locality N^ε and rate arbitrarily close to 1?

Previous work. Rate approaching 1 and locality N^ε: multiplicity codes [Kopparty, Saraf, Yekhanin '11]; lifted codes [Guo, Kopparty, Sudan '12]. These have decoders similar to RM: the queries form a good code. Another regime, very low rate but locality 3 (k = N/2^{2^{O(√log N)}}): matching vector codes [Yekhanin '08, Efremenko '09, …]. These decoders are different: the queries cannot tolerate any errors, but there are so few queries that they are probably all correct. This work: expander codes [Hemenway, Ostrovsky, Wootters '13]. The decoder is similar in spirit to the low-query decoders: the queries will not form an error correcting code.

Tanner codes [Tanner '81]. Given: a d-regular graph G with n vertices and N = nd/2 edges, and an inner code C_0 with block length d over alphabet Σ, we get a Tanner code C. C has block length N and alphabet Σ. Codewords are labelings of the edges of G: a labeling is in C if, at every vertex, the labels on the incident edges form a codeword of C_0.

Example [Tanner '81]. G is K_8 and C_0 is the [7, 4, 3] Hamming code, so N = (8 choose 2) = 28 and Σ = {0, 1}. A codeword of C is a {0, 1}-labeling of the edges of G, i.e., a vector in {0, 1}^28 (drawn as red/blue edges in the figure). At each vertex, the labels on the seven incident edges form a codeword of the Hamming code.

Encoding Tanner codes. Encoding is easy! 1. Generate the parity-check matrix (this requires the edge-vertex incidence matrix of the graph and the parity-check matrix of the inner code). 2. Calculate a basis for the kernel of the parity-check matrix. 3. This basis defines a generator matrix for the linear Tanner code. 4. Encoding is just multiplication by this generator matrix.
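The recipe above can be sketched directly for the running K_8 / [7,4,3]-Hamming example. This is my own toy implementation, with an arbitrary (but standard) choice of Hamming parity-check matrix and of edge ordering around each vertex; the ordering affects the exact dimension, but the dimension is always at least 28 − 8·3 = 4.

```python
from itertools import combinations

edges = list(combinations(range(8), 2))   # the 28 edges of K_8

# Parity-check matrix of the [7,4,3] Hamming code (one standard choice).
H0 = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

# Tanner parity-check: for each vertex v, apply H0 to the labels of the
# 7 edges incident to v (in a fixed, arbitrary order).
rows = []
for v in range(8):
    incident = [i for i, e in enumerate(edges) if v in e]   # 7 edges
    for h in H0:
        row = [0] * len(edges)
        for j, ei in enumerate(incident):
            row[ei] = h[j]
        rows.append(row)

def rank_gf2(mat):
    """Rank over GF(2) by Gaussian elimination."""
    mat = [r[:] for r in mat]
    rank = 0
    for c in range(len(mat[0])):
        piv = next((i for i in range(rank, len(mat)) if mat[i][c]), None)
        if piv is None:
            continue
        mat[rank], mat[piv] = mat[piv], mat[rank]
        for i in range(len(mat)):
            if i != rank and mat[i][c]:
                mat[i] = [a ^ b for a, b in zip(mat[i], mat[rank])]
        rank += 1
    return rank

k = len(edges) - rank_gf2(rows)   # dimension of the Tanner code
```

Computing a basis of the kernel (i.e., the generator matrix, step 2 of the recipe) is the same elimination; here we only extract the dimension.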

Linearity. If the inner code C_0 is linear, so is the Tanner code C. Write C_0 = ker(H_0) for some parity-check matrix H_0, so that x ∈ C_0 ⟺ H_0 x = 0. Then codewords of the Tanner code C are also defined by linear constraints: y ∈ C ⟺ H_0 y|_{Γ(v)} = 0 for every vertex v of G, where y|_{Γ(v)} denotes the labels on the edges incident to v.

Example: edge-vertex incidence matrix of K_8. There is a row for each vertex and a column for each edge. Columns have weight 2 (each edge hits two vertices); rows have weight 7 (each vertex has degree seven).

Example: parity-check matrix of a Tanner code, for K_8 and the [7, 4, 3] Hamming code. (Figure: the parity-check matrix is assembled vertex by vertex, placing a copy of the Hamming parity-check matrix into each vertex's block of rows, following the edge-vertex incidence matrix of K_8.)

If the inner code has good rate, so does the outer code. Say that C_0 is linear. If C_0 has rate r, it satisfies (1 − r)d linear constraints, and each of the n vertices of G must satisfy these constraints. So C is defined by at most n(1 − r)d constraints. The length of C is N = #edges = nd/2, so the rate of C is R = k/N ≥ (N − n(1 − r)d)/N = 2r − 1.

Better rate bounds? The lower bound R ≥ 2r − 1 is independent of the ordering of the edges around a vertex, but Tanner already noticed that order matters. Let G be the complete bipartite graph with 7 vertices per side and let C_0 be the [7, 4, 3] Hamming code. Then different natural orderings achieve Tanner codes with parameters [49, 16, 9] (rate 16/49 ≈ .327), [49, 12, 6] (rate 12/49 ≈ .245), and [49, 7, 7] (rate 7/49 = 1/7 ≈ .143, meeting the lower bound 2r − 1 = 1/7).

Expander codes. When the underlying graph is an expander graph, the Tanner code is an expander code. Expander codes admit very fast decoding algorithms [Sipser, Spielman '96], with further improvements in [Zémor '01; Barg, Zémor '02, '05, '06].

Main Result Given: a d-regular expander graph; an inner code of length d with smooth reconstruction. Then: We will give a local-correcting procedure for this expander code.

Smooth reconstruction. Bob makes q queries to a codeword c ∈ Σ^N, hoping to learn c_i. Suppose that: each of Bob's q queries is (close to) uniformly distributed (they don't need to be independent!); and from the (uncorrupted) queries he can always recover c_i. But! He doesn't need to tolerate any errors. Then: we say that the code has a smooth reconstruction algorithm.

Smooth reconstruction, sans stick figures. Definition: a code C_0 ⊆ Σ^d has a q-query smooth reconstruction algorithm if, for all i ∈ [d] and for all codewords c ∈ C_0: Bob can always determine c_i from a set of queries c_{i_1}, …, c_{i_q}; and each index i_j is (close to) uniformly distributed in [d].
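A classical example that makes this definition concrete (my own addition, not from the talk) is the 2-query decoder for the Hadamard code: to learn C(x)_a = ⟨x, a⟩, query positions a + r and r for a uniformly random shift r. Each query is individually uniform, and the XOR of the two uncorrupted answers is always ⟨x, a⟩.

```python
import random

m = 4
x = [1, 0, 1, 1]   # an arbitrary message in F_2^m (any choice works)

def dot(u, v):
    return sum(ui & vi for ui, vi in zip(u, v)) % 2

# Hadamard code: one coordinate per point a in F_2^m, C(x)_a = <x, a> mod 2.
points = [[(j >> b) & 1 for b in range(m)] for j in range(2 ** m)]
codeword = {tuple(a): dot(x, a) for a in points}

def reconstruct(a):
    """2-query smooth reconstruction of C(x)_a.

    Queries a + r and r for uniformly random r; each query position is
    individually uniform over F_2^m, and
    <x, a + r> + <x, r> = <x, a> (mod 2)."""
    r = random.choice(points)
    a_plus_r = [(ai + ri) % 2 for ai, ri in zip(a, r)]
    return (codeword[tuple(a_plus_r)] + codeword[tuple(r)]) % 2
```

Note that this is exactly "smooth but not error-tolerant": a single corruption at one of the two queried positions flips the answer.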

Main Result Given: a d-regular expander graph; an inner code of length d with smooth reconstruction. Then: We will give a local-correcting procedure for this expander code.

Decoding algorithm: main idea. We want to correct the label on a single edge; in the diagrams, q = 2. (The figures repeatedly expand each queried edge into the q edges that reconstruct it, growing a tree of queries outward from the target edge.)

The expander walk as a tree. We want to correct the label on the root edge. The queries form a q-ary tree of depth O(log n) (the inner code has q-query reconstruction).

The expander walk as a tree (a q-ary tree of depth O(log n)). True statements: the symbols on the leaves determine the symbol at the root; there are q^{O(log n)} ≈ N^ε leaves; and the leaves are (nearly) uniformly distributed in G. Idea: query the leaves! Problems: there are errors on the leaves, and errors on the leaves propagate up the tree.

Correcting the last layer. Each leaf is extended by a walk of length O(log n), consisting of: edges that take us to (nearly) uniform locations in the graph (not read); the edge we want to learn (not read); and edges used for error correction (read).

Why should this help? False statement: "now the queries can tolerate a few errors." (A correction path can still be heavily corrupted, as the figure shows.) True statements: that is basically the only thing that can go wrong, and because everything in sight is (nearly) uniform, it probably won't go wrong.

Decoding algorithm. Each leaf edge queries its symbol and summarizes what it sees: "if my correct value were 0, there would be some path below me with 1 error; if my correct value were 1, there would be some path below me with 0 errors" (or the other way around, depending on the symbol it read). Each second-level edge reads its own symbol, runs the inner code's local corrector on its children's candidate values, and reasons the same way: "if I were 0, the best consistent completion gives a path with two errors; if I were 1, there is a path with no errors," and so it concludes: "if my correct value were 0, there would be some path below me with 2 errors; if my correct value were 1, there would be some path below me with 0 errors." This continues up the tree.

Decoding algorithm. At the root: "if my correct value were 0, there would be some path below me with Ω(log n) errors; if my correct value were 1, there would be some path below me with 7 errors." Triumphantly return the value with the lightly-corrupted path! This only fails if there exists a path that is heavily corrupted, and heavily corrupted paths occur with exponentially small probability.
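The final claim, that heavily corrupted paths are exponentially rare, can be illustrated numerically. This is my own toy simulation under a simplifying assumption the talk does not make: each queried position is corrupted independently with probability δ. (In the talk the errors are adversarial, but the query positions are nearly uniform, which is what makes this i.i.d. intuition apply.)

```python
import random

random.seed(0)
delta, trials = 0.05, 10000   # error rate and number of sampled paths

def heavy_fraction(L):
    """Fraction of sampled length-L paths that are 'heavily corrupted',
    i.e., carry more than L/4 errors, when each position is corrupted
    independently with probability delta."""
    heavy = 0
    for _ in range(trials):
        errs = sum(random.random() < delta for _ in range(L))
        if errs > L / 4:
            heavy += 1
    return heavy / trials
```

By a Chernoff bound, the probability that a length-L path carries more than L/4 errors decays exponentially in L when δ < 1/4, and the simulation shows exactly that: heavy paths are already rare at modest L and become rarer as L grows.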

One choice for inner code: based on affine geometry (see [Assmus, Key '94, '98] for a nice overview). Let L_1, …, L_t be the r-dimensional affine subspaces of F_q^m, and consider the code of length q^m whose parity-check matrix H has a row for each flat L_i and a column for each point x ∈ F_q^m, with H_{i,x} = 1 if x ∈ L_i and H_{i,x} = 0 otherwise; each row has q^r nonzeros. Smooth reconstruction: to learn the coordinate indexed by x ∈ F_q^m, pick a random r-flat L_i containing x and query all of the points in L_i. Observe: on its own, this is not a very good LCC!
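To make the smooth reconstruction concrete, here is a tiny instance of my own (far too small to be a useful inner code): the length-9 code over F_3 whose parity checks are the lines (1-flats) of F_3^2. Every line through x gives c_x as minus the sum of the other points on the line, and choosing the line uniformly at random makes each queried point (nearly) uniform over the remaining coordinates.

```python
import random
from itertools import product

q, m = 3, 2
points = list(product(range(q), repeat=m))   # the 9 points of F_3^2
idx = {p: i for i, p in enumerate(points)}

# All affine lines in F_3^2: a base point plus multiples of a direction.
lines = set()
for base in points:
    for d in points:
        if d == (0, 0):
            continue
        lines.add(frozenset(tuple((b + t * di) % q for b, di in zip(base, d))
                            for t in range(q)))
lines = [sorted(l) for l in lines]           # 12 lines, 3 points each

# The code: vectors c in F_3^9 whose sum over every line is 0 mod 3.
# Brute force over all 3^9 = 19683 vectors is fine at this size.
codewords = [c for c in product(range(q), repeat=len(points))
             if all(sum(c[idx[p]] for p in line) % q == 0 for line in lines)]

def smooth_reconstruct(c, x):
    """Recover c_x from a uniformly random line through x:
    c_x = -(sum of the other points on the line) mod q."""
    line = random.choice([l for l in lines if x in l])
    return (-sum(c[idx[p]] for p in line if p != x)) % q
```

As on the slide, this is smooth (queries are spread out) but not error-tolerant: the reconstruction reads only q − 1 = 2 positions and trusts all of them.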

One good instantiation. Graph: a Ramanujan graph. Inner code: a finite geometry code. Result: for any α, ε > 0 and for infinitely many N, we get a code with block length N which has rate 1 − α, has locality (N/d)^ε, and tolerates a constant error rate.

Summary. When the inner code has smooth reconstruction, we give a local-decoding procedure for expander codes. This gives a new (and yet old!) family of linear locally correctable codes of rate approaching 1.

Open questions. Can we use expander codes to achieve local correctability with lower query complexity? Can we use inner codes with rate less than 1/2?

The end Thanks!