
Sublinear-Time Error-Correction and Error-Detection



Presentation Transcript


  1. Sublinear-Time Error-Correction and Error-Detection Luca Trevisan U.C. Berkeley luca@eecs.berkeley.edu

  2. Contents • Survey of results on error-correcting codes with sub-linear time checking and decoding procedures • Results originated in complexity theory

  3. Error-correction

  4. Error-detection

  5. Minimum Distance

  6. Ideally • Constant information rate • Linear minimum distance • Very efficient decoding (Sipser-Spielman: a linear-time deterministic decoding procedure)

  7. Sub-linear time decoding? • Must be probabilistic • Must have some probability of incorrect decoding • Even so, is it possible?

  8. Motivations & Context • Sub-linear time decoding useful for worst-case to average-case reductions, and in information-theoretic Private Information Retrieval • Sub-linear time checking arises in PCP • Useful in practice?

  9. Error-correction

  10. Hadamard Code

  11. Example Encoding of… is…

  12. “Constant time” decoding
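
A minimal sketch in Python of the standard two-query decoder for the Hadamard code, whose codeword lists <x, a> mod 2 for every a in {0,1}^k: to read a bit of x, query the (possibly corrupted) table at a random a and at a XOR e_i and XOR the two answers. The function names and the toy usage are my own.

```python
import random

def hadamard_encode(x):
    """Hadamard encoding of the bit-vector x: the table of <x, a> mod 2
    for every a in {0,1}^k, so the codeword has length 2^k."""
    k = len(x)
    return [sum((x[j] & (a >> j)) & 1 for j in range(k)) % 2
            for a in range(2 ** k)]

def local_decode_bit(codeword_oracle, k, i):
    """Recover bit i of the message with two queries: since
    <x, a> XOR <x, a XOR e_i> = x_i, XORing the answers at a random a
    and at a XOR e_i gives x_i.  If at most a delta fraction of the
    table is corrupted, each query is individually uniform, so the
    output is wrong with probability at most 2*delta."""
    a = random.randrange(2 ** k)
    return codeword_oracle(a) ^ codeword_oracle(a ^ (1 << i))

# Toy usage: decode every bit of a 4-bit message from an uncorrupted table.
x = [1, 0, 1, 1]
table = hadamard_encode(x)
print([local_decode_bit(lambda a: table[a], len(x), i) for i in range(len(x))])
# expected output: [1, 0, 1, 1]
```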

  13. Analysis

  14. A Lower Bound • If: the code is linear, the alphabet is small, and the decoding procedure uses two queries • Then: exponential encoding length is necessary (Goldreich-Trevisan, Samorodnitsky)

  15. More trade-offs • For k queries and binary alphabet: • More complicated formulas for bigger alphabet

  16. Construction without polynomials

  17. Construction with polynomials • View the message as a polynomial p: F^k -> F of degree d (F is a field, |F| >> d) • Encode the message by evaluating p at all |F|^k points • To encode an n-bit message, one can take |F| polynomial in n and d, k around (log n)^O(1)
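
A minimal sketch in Python of this evaluation encoding; the parameters P, K, D below are toy values of my own choosing (the slide's regime has |F| polynomial in n and d, k around (log n)^O(1)), and the message symbols are simply used as coefficients of the monomials of total degree at most d.

```python
from itertools import product

P, K, D = 13, 2, 3   # toy parameters: prime field size, number of variables, degree

def monomials(k, d):
    """All exponent vectors of total degree <= d, in a fixed order."""
    return [e for e in product(range(d + 1), repeat=k) if sum(e) <= d]

def poly_encode(message, p=P, k=K, d=D):
    """Pad the message to one coefficient per monomial, then list the value
    of the resulting k-variate degree-<=d polynomial at every point of F_p^k."""
    mons = monomials(k, d)
    assert len(message) <= len(mons), "message too long for these parameters"
    coeffs = list(message) + [0] * (len(mons) - len(message))

    def evaluate(point):
        total = 0
        for c, e in zip(coeffs, mons):
            term = c
            for x_i, e_i in zip(point, e):
                term = (term * pow(x_i, e_i, p)) % p
            total = (total + term) % p
        return total

    return [evaluate(pt) for pt in product(range(p), repeat=k)]

# The codeword has p^k symbols, one per evaluation point.
codeword = poly_encode([3, 1, 4, 1, 5])
print(len(codeword))   # 13^2 = 169
```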

  18. To reconstruct p(x) • Pick a random line in F^k passing through x • Evaluate p on d+1 points of the line • By interpolation, find the degree-d univariate polynomial that agrees with p on the line • Use the interpolating polynomial to estimate p(x) • The algorithm reads p at d+1 points, each uniformly distributed (a code sketch follows the figure on the next slide) Beaver-Feigenbaum; Lipton; Gemmell-Lipton-Rubinfeld-Sudan-Wigderson

  19. [Figure: the queried points x, x+y, x+2y, …, x+(d+1)y on a random line through x]
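
A minimal sketch in Python of the line decoder described on slide 18; lagrange_at_zero and line_decode are my own helper names, and the sketch assumes the field is F_p for a prime p > d+1, so the interpolation points 1, ..., d+1 are distinct.

```python
import random

def lagrange_at_zero(ts, vals, p):
    """Value at t = 0 of the unique degree-(len(ts)-1) polynomial through
    the points (ts[i], vals[i]), with all arithmetic mod the prime p."""
    total = 0
    for i, (ti, vi) in enumerate(zip(ts, vals)):
        num, den = 1, 1
        for j, tj in enumerate(ts):
            if j != i:
                num = (num * (-tj)) % p        # factor (0 - t_j)
                den = (den * (ti - tj)) % p    # factor (t_i - t_j)
        total = (total + vi * num * pow(den, p - 2, p)) % p  # Fermat inverse
    return total

def line_decode(table_oracle, x, d, p, k):
    """Estimate p(x): pick a random direction y, read the table at
    x + t*y for t = 1, ..., d+1 (each query individually uniform in F_p^k),
    interpolate the univariate restriction, and return its value at t = 0,
    which corresponds to the point x."""
    y = [random.randrange(p) for _ in range(k)]
    ts = list(range(1, d + 2))
    points = [tuple((xi + t * yi) % p for xi, yi in zip(x, y)) for t in ts]
    vals = [table_oracle(pt) for pt in points]
    return lagrange_at_zero(ts, vals, p)
```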

  20. Error-detection

  21. Checking polynomial codes • Consider encodings by multivariate low-degree polynomials • Given p, pick a random z, do the decoding for p(z), and compare with the actual value of p(z) • A “simple” case of the low-degree test • Rejection probability is proportional to the distance from the code (Rubinfeld-Sudan)
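
A minimal sketch of this check in Python, reusing the line_decode helper from the earlier sketch; the number of trials is an arbitrary choice.

```python
import random

def low_degree_test(table_oracle, d, p, k, trials=20):
    """Pick a random z, reconstruct p(z) along a random line, and compare
    with the value the table itself claims at z; reject on any mismatch."""
    for _ in range(trials):
        z = tuple(random.randrange(p) for _ in range(k))
        if line_decode(table_oracle, z, d, p, k) != table_oracle(z):
            return False
    return True
```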

  22. Bivariate Code • A degree-d bivariate polynomial p: F x F -> F can be represented as 2|F| univariate degree-d polynomials (the “rows” and the “columns”) • Example: 2x^2 + xy + y^2 + 1 (mod 5)

  23. Bivariate Low-Degree Test • Pick a random row and a random column; check that they agree on the intersection • If |F| is a constant factor bigger than d, then the rejection probability is proportional to the distance from the code (Arora-Safra, ALMSS, Polishchuk-Spielman)
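
A minimal sketch of the row/column test in Python; it assumes the word is handed to the tester as |F| claimed row polynomials and |F| claimed column polynomials, each given by its coefficient list mod a prime p (the function names are my own).

```python
import random

def eval_poly(coeffs, x, p):
    """Evaluate a univariate polynomial (coefficients in increasing degree) at x, mod p."""
    result = 0
    for c in reversed(coeffs):       # Horner's rule
        result = (result * x + c) % p
    return result

def bivariate_test(row_polys, col_polys, p, trials=20):
    """Pick a random row i and a random column j and check that the claimed
    row polynomial evaluated at j agrees with the claimed column polynomial
    evaluated at i, i.e. both report the same value for the entry (i, j)."""
    for _ in range(trials):
        i = random.randrange(p)
        j = random.randrange(p)
        if eval_poly(row_polys[i], j, p) != eval_poly(col_polys[j], i, p):
            return False
    return True
```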

  24. Efficiency of Decoding vs Checking

  25. Tensor Product Codes • Suppose we have a linear code C with codewords in {0,1}^m • Define a new code C' with codewords in {0,1}^(m x m): a “matrix” is a codeword of C' if each of its rows and each of its columns is a codeword of C • If C has many codewords and large minimum distance, the same is true for C'
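
A minimal sketch in Python of the membership condition defining C'; the even-parity code stands in for C only so the example runs, and any other linear code's membership test could be plugged in.

```python
def in_parity_code(word):
    """Toy membership test for C: the [m, m-1] even-weight (parity-check) code."""
    return sum(word) % 2 == 0

def in_tensor_code(matrix, in_C=in_parity_code):
    """A 0/1 matrix is a codeword of C' iff every row and every column is a codeword of C."""
    rows_ok = all(in_C(row) for row in matrix)
    cols_ok = all(in_C(col) for col in zip(*matrix))
    return rows_ok and cols_ok

# Example: every row and every column below has even weight, so the matrix is in C'.
example = [[1, 1, 0],
           [1, 0, 1],
           [0, 1, 1]]
print(in_tensor_code(example))   # True
```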

  26. Generalization of the Bivariate Low-Degree Test • Suppose C has K codewords • Define a code C'' over the alphabet [K], with codewords of length 2m • C'' has as many codewords as C' • For each codeword y of C', the corresponding codeword in C'' contains the value of each row and each column of y • Test: pick a random “row” and a random “column”, check that they agree on the intersection • Analysis?
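
One possible reading of this test as code, a sketch under the assumption that a word of C'' is given as m "row" symbols and m "column" symbols, each symbol being a length-m codeword of C (so comparing a row and a column means comparing one bit of each).

```python
import random

def intersection_test(row_words, col_words, m, trials=20):
    """Pick a random row index i and column index j and check that the
    claimed row codeword and the claimed column codeword report the same
    bit at their single intersection (i, j); reject on any mismatch."""
    for _ in range(trials):
        i = random.randrange(m)
        j = random.randrange(m)
        if row_words[i][j] != col_words[j][i]:
            return False
    return True
```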

  27. Negative Results? • No known lower bound for locally checkable codes • Possible to get encoding length n^(1+o(1)) and checking with O(1) queries and {0,1} alphabet? • Possible to get encoding length O(n) with O(1) queries and small alphabet?

  28. Applications? • Better locally decodable codes have applications to PIR • General/simple analysis of checkable proofs could have application to PCP (linear-length PCP, simple proof of the PCP theorem) • Applications to the practice of fault-tolerant data storage/transmission?
