List-Decoding Reed-Muller Codes over Small Fields
Parikshit Gopalan (Microsoft), Adam R. Klivans (UT Austin), David Zuckerman (UT Austin)

Presentation Transcript


  1. List-Decoding Reed-Muller Codes over Small Fields. Parikshit Gopalan (Microsoft), Adam R. Klivans (UT Austin), David Zuckerman (UT Austin).

  2. Error-Correcting Codes. Communication over a noisy channel: an adversary corrupts 10% of the bits. Problem: Recover the (entire) message. Solution: Introduce redundancy.

  3. Error-Correcting Codes: deep-space communication, the Internet, cellphones, satellite broadcast, audio CDs, bar-codes.

  4. Codes from Polynomials. Encoding: Alice wants to send (a, b). Let L(x) = ax + b. Send L(1), L(2), …, L(7).

  5. Codes from Polynomials Adversary: Corrupts two values. Decoding: Find the (unique) line that passes through 5 points.
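A minimal sketch of the toy code on slides 4–5. The slides leave the field unspecified, so arithmetic modulo the prime 11 is an assumption here, and all function names are illustrative.

```python
# Toy version of slides 4-5: encode (a, b) as the line L(x) = a*x + b evaluated
# at x = 1..7; an adversary corrupts at most 2 of the 7 values; the decoder
# returns the unique line agreeing with at least 5 received values.

P = 11  # hypothetical field size (an assumption; the slides do not fix a field)

def encode(a, b):
    return [(a * x + b) % P for x in range(1, 8)]

def decode(received):
    # Two distinct lines agree on at most 1 of the 7 evaluation points, so a
    # line matching >= 5 positions of the received word is unique.
    for a in range(P):
        for b in range(P):
            agreements = sum((a * x + b) % P == y
                             for x, y in zip(range(1, 8), received))
            if agreements >= 5:
                return a, b
    return None  # more than 2 corruptions

word = encode(3, 4)
word[1] = (word[1] + 5) % P   # adversary corrupts two positions
word[5] = (word[5] + 2) % P
print(decode(word))           # -> (3, 4)
```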

  6.–7. Codes from Polynomials • Low-degree polynomials differ in many places. • Relative distance δ(C,C′): Hamming distance / length. • Minimum distance: δ = min{δ(C,C′) : distinct codewords C, C′}. • Reed-Solomon codes: univariate polynomials. • Reed-Muller codes: multivariate polynomials.

  8.–10. Reed-Muller Codes [Muller'54, Reed'54] • Messages: polynomials of degree r in m variables over {0,1}. • Example: Q(X1,X2,X3) = X1X2 + X3. • Encoding: the truth table, e.g. 01010110. • Minimum distance: δ = 2^{-r}. • Hadamard codes: r = 1.
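A two-line check of the encoding on slides 8–10. The variable ordering (X3 varying fastest) is an assumption; it reproduces the string shown on the slide.

```python
from itertools import product

# Truth-table (Reed-Muller) encoding of the slides' example Q(X1,X2,X3) = X1*X2 + X3
# over {0,1}. Points of {0,1}^3 are enumerated with X3 varying fastest (assumed order).

def Q(x1, x2, x3):
    return (x1 * x2 + x3) % 2

codeword = ''.join(str(Q(*x)) for x in product([0, 1], repeat=3))
print(codeword)  # -> 01010110
```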

  11. Decoding ≡ Polynomial Reconstruction. Problem: Given data points, find a low-degree polynomial that fits best. Well-studied problem, numerous applications.

  12. The Decoding Problem • Received word R: {0,1}^m → {0,1}. • Unique Decoding: find the codeword C such that δ(R,C) < δ/2. • List Decoding [Elias'57, Wozencraft'58]: find all codewords C such that δ(R,C) < ρ; there are few such C. • Johnson bound: the list is small up to ρ = J(δ), where J(δ) = (1 − √(1 − 2δ))/2 = δ/2 + δ²/4 + … < δ.
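The Johnson radius drives several numbers quoted later in the talk (0.146 for δ = ¼, ¼ for δ = 0.375, ½ for δ = ½); a one-function check:

```python
from math import sqrt

# Johnson radius J(delta) = (1 - sqrt(1 - 2*delta)) / 2 for a binary code of
# relative minimum distance delta: below this radius the list is guaranteed small.

def johnson(delta):
    return (1 - sqrt(1 - 2 * delta)) / 2

print(round(johnson(0.25), 3))    # 0.146  (RM of degree 2: delta = 2^-2)
print(round(johnson(0.375), 3))   # 0.25   (weight of rank >= 2 quadratic forms, slide 38)
print(round(johnson(0.5), 3))     # 0.5    (= J(2 * 2^-r) for r = 2, slide 16)
```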

  13. The Computational Model • Global Decoding: given all of R as input; run time poly in n = 2^m. • Local Decoding: given an oracle for R (query x, receive R(x)); run time poly in m = log n.

  14. Decoding Reed-Muller Codes • Unique Decoding: majority-logic decoder [Reed'54]. • Local List Decoding: Hadamard codes (r = 1) [Goldreich-Levin'89]; alternate algorithms [Levin, Rackoff, Kushilevitz-Mansour, …]. • No algorithms known for r ≥ 2. • Good algorithms for large fields (r < |F|) [Goldreich-Rubinfeld-Sudan, Arora-Sudan, Sudan-Trevisan-Vadhan].

  15. Our Results • Main result: local list-decoding of RM codes for r ≥ 2, up to the minimum distance, i.e. radius 2^{-r} − ε. • Returns a list of size ε^{-O(r)} in time poly(m^r, ε^{-r}). • Improves majority-logic decoding for r ≥ 2. • Generalizes [Goldreich-Levin'89]. • Beats the Johnson bound: for r = 2, radius 0.25 versus 0.146. • The list-size becomes exponential at radius 2^{-r}.

  16. Our Results • Global list-decoding: deterministic algorithm for r ≥ 2. • Works up to radius J(2·2^{-r}) − ε, beyond the minimum distance: for r = 2, ½ − ε versus ¼ − ε. • Returns a list of size ε^{-O(m)} in time poly(ε^{-O(m)}). • Brute force needs time O(2^{m^r}). • New combinatorial bound.

  17.–18. Local List-Decoding • {0,1}^m labeled by the received word R. • Fix a codeword Q so that δ(Q,R) < δ − ε. • (Figure: points where R(x) = Q(x) versus points where R(x) ≠ Q(x).)

  19.–23. A Self-Corrector [Goldreich-Levin] • Goal: find Q(b) with high probability. • Pick a small subspace A at random; assume we know Q on A. • Error rate on the coset b + A is < δ (very likely). • Hence the error rate on the combined subspace spanned by A and b is < δ/2. • Unique decode! (A schematic sketch follows.)
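A schematic rendering of slides 19–23, purely to fix ideas. The helper names, the advice table `Q_on_A`, and the `unique_decode` black box are assumptions, not the paper's interface, and no parameters are tuned.

```python
from itertools import product

# Self-correction in the Goldreich-Levin style (slides 19-23), as a sketch:
# to find Q(b), combine the trusted values of Q on a random subspace A (the
# advice) with the noisy values of R on the coset b + A, then unique-decode on
# the combined subspace, where the error rate is < delta/2 whp.

def self_correct(R, Q_on_A, A_basis, b, unique_decode):
    labeled = []
    for coeffs in product([0, 1], repeat=len(A_basis)):
        a = span_point(A_basis, coeffs, len(b))
        labeled.append((a, Q_on_A[a]))              # on A: no errors (advice)
        labeled.append((xor(a, b), R(xor(a, b))))   # on b + A: error < delta whp
    Q_local = unique_decode(labeled)                # assumed black box (e.g. majority logic)
    return Q_local(b)

def span_point(basis, coeffs, m):
    v = (0,) * m
    for c, vec in zip(coeffs, basis):
        if c:
            v = xor(v, vec)
    return v

def xor(u, v):
    return tuple(x ^ y for x, y in zip(u, v))
```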

  24. Interpolating Sets • Q of degree r is efficiently computable from the values Q(b), b ∈ B = B(r). • r = 1: B = {0, e1, e2, …, em}. • General r: all b of weight at most r. • Pick one random A; use A to self-correct every b in the interpolating set B. • Union bound ⇒ whp correct on all of B. • Can improve via Noisy Interpolating Sets [Dvir, Shpilka].
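Why points of weight at most r suffice (slide 24): over F_2, the coefficient of the monomial ∏_{i∈S} X_i equals ∑_{T⊆S} Q(1_T) mod 2, so a degree-r polynomial is pinned down by its values on indicator vectors of weight ≤ r. A small check; the function names are illustrative, not from the paper.

```python
from itertools import combinations, chain

# Moebius inversion over the subset lattice, mod 2: the coefficient of the
# monomial indexed by S is the parity of Q over the indicator vectors 1_T, T subset of S.

def subsets(S):
    return chain.from_iterable(combinations(S, k) for k in range(len(S) + 1))

def coefficient(S, value_at):
    # value_at(T) = Q evaluated at the 0/1 indicator vector of the set T
    return sum(value_at(T) for T in subsets(S)) % 2

# Example: Q(X1, X2, X3) = X1*X2 + X3 (variables indexed 0, 1, 2)
def Q(x):
    return (x[0] * x[1] + x[2]) % 2

def value_at(T):
    x = [1 if i in T else 0 for i in range(3)]
    return Q(x)

print(coefficient((0, 1), value_at))  # 1: the monomial X1*X2 is present
print(coefficient((0, 2), value_at))  # 0: the monomial X1*X3 is absent
```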

  25. Overall Algorithm (block diagram): R: {0,1}^m → {0,1}, together with the advice, feeds the Self-Corrector, whose outputs feed the Interpolator.

  26. Generating our own Advice • Advice: Q restricted to A. • A could have dimension log m. • Only m choices for r = 1; too many choices when r ≥ 2. • (Here dim(A) = log(1/ε), with ε = 1/poly(m).)

  27. Generating our own Advice • Advice: Q restricted to A. • A could have dimension k = log m. • Error rate on A is < δ, whp. • Decode on A in time poly(2^k).

  28. Global List-Decoding: Case r = 2. Problem: Given R: {0,1}^k → {0,1}, find all Q of degree 2 so that δ(Q,R) < ¼. Run time polynomial in the block-length 2^k.

  29. Global List-Decoding: Case r = 2 • Problem: Given R: {0,1}^k → {0,1}, find all Q of degree 2 so that δ(Q,R) < ρ. • ℓ(ρ): worst-case list-size. • The algorithm runs in time poly(2^k, ℓ(ρ)). • Works for all ρ. • Does not imply bounds on the list-size.

  30. Global List-Decoding: Case r = 2. Problem: Given R: {0,1}^k → {0,1}, find all Q so that δ(Q,R) < ρ. Write Q = Q0(X1,…,X_{k-1}) + Xk·L(X1,…,X_{k-1}), and let ρ0, ρ1 be the error rates on the halves Xk = 0 and Xk = 1, so ρ = ½(ρ0 + ρ1). Suppose ρ0 ≤ ρ1; then ρ0 ≤ ρ and ρ1 ≤ 2ρ. (Figure: the Xk = 0 half carries Q0 with error ρ0; the Xk = 1 half carries Q0 + L with error ρ1.) • Recover Q0 from Xk = 0 (degree 2, error ≤ ρ). • Recover L from Xk = 1 (degree 1, error ≤ 2ρ).

  31. Global List-Decoding: Case r = 2. Problem: Given R: {0,1}^k → {0,1}, find all Q so that δ(Q,R) < ρ, where ρ = ½(ρ0 + ρ1). We don't know whether ρ0 ≤ ρ1, so try all possibilities; the overhead is 2k. (A schematic sketch of the recursion follows.)
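A schematic of the recursion described on slides 30–31, not the paper's exact algorithm: it assumes the Xk = 0 half is the less noisy one (the slides note one must try both possibilities), and it treats the Hadamard list-decoder and the small base case as assumed black boxes.

```python
# Recursive list-decoding sketch for degree 2 (slides 30-31). Each function R
# takes a 0/1 tuple of length k. `hadamard_list_decode(f, k, radius)` and
# `brute_force_quadratics(f, k, radius)` are assumed black boxes.

def list_decode_deg2(R, k, rho, hadamard_list_decode, brute_force_quadratics):
    if k <= 2:
        return brute_force_quadratics(R, k, rho)     # tiny base case
    R0 = lambda x: R(x + (0,))                       # restriction to X_k = 0
    R1 = lambda x: R(x + (1,))                       # restriction to X_k = 1
    candidates = []
    for Q0 in list_decode_deg2(R0, k - 1, rho,
                               hadamard_list_decode, brute_force_quadratics):
        # Once Q0 is fixed, R1 + Q0 is a noisy version of the linear part L,
        # with error rate at most 2*rho (slide 30).
        residue = lambda x, Q0=Q0: (R1(x) + Q0(x)) % 2
        for L in hadamard_list_decode(residue, k - 1, 2 * rho):
            candidates.append(lambda x, Q0=Q0, L=L:
                              (Q0(x[:-1]) + x[-1] * L(x[:-1])) % 2)
    # The real algorithm also prunes: keep only candidates Q with delta(Q, R) < rho.
    return candidates
```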

  32. Bounds on List-Size. Problem: Given R: {0,1}^k → {0,1}, bound the number of quadratic polynomials Q such that δ(Q,R) < ¼. Goal: a bound of 2^{O(k)}. Johnson bound: gives 2^{O(k)} only up to radius J(¼) ≈ 0.146. Can we improve the distance of RM(2,k)?

  33. Analogy: Inter-Star Distance. Proxima Centauri: 4.2 light-years.

  34. Inter-Star Distance? Within 100,000 light-years (inside the Milky Way).

  35. Intergalactic Distance Andromeda: 2.5 million light years away.

  36. Inter-Star Distance? Local Group of Galaxies, Local Supercluster, …

  37. Bounds on List-Size. Problem: Given R: {0,1}^k → {0,1}, bound the number of quadratic polynomials Q such that δ(Q,R) < ¼. Goal: a bound of 2^{O(k)}. Johnson bound: gives 2^{O(k)} only up to radius J(¼) ≈ 0.146. Can we improve the distance of RM(2,k)? Yes, for all but a 2^{O(k)}-size subset of RM(2,k). Thm: Every quadratic form can be written as Q = L1L2 + … + L_{2t-1}L_{2t} + L0, where the Li are linearly independent and 1 ≤ t ≤ k/2; t is the rank of Q.

  38. Rank versus Weight • Rank ≥ 2 forms: weight ≥ 0.375, and J(0.375) = ¼. • Rank-1 forms: only 2^{2k} of them. • Thm: The list-size is 2^{O(k)} at radius ¼.
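A quick numerical check of the weights quoted on slide 38 (small k, illustrative lambdas):

```python
from itertools import product

# Relative weights of low-rank quadratic forms: a rank-1 form like X1*X2 has
# weight 1/4, a rank-2 form like X1*X2 + X3*X4 has weight 3/8 = 0.375,
# matching J(0.375) = 1/4.

def weight(f, k):
    points = list(product([0, 1], repeat=k))
    return sum(f(x) for x in points) / len(points)

print(weight(lambda x: (x[0] * x[1]) % 2, 4))                # 0.25
print(weight(lambda x: (x[0] * x[1] + x[2] * x[3]) % 2, 4))  # 0.375
```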

  39.–46. Bounding the List-size (animation; figure: the codewords within radius ¼ of the received word R). • Each remaining pair is at distance ≥ 0.375. • List-size ≤ 2^k by the Johnson bound.

  47. Bounding the List-Size • At most 2^k balls, by the Johnson bound. • Each ball contains at most 2^{2k} codewords. • Overall: at most 2^{3k} codewords at radius ¼. • We need k = O(log m) for local decoding.

  48. Overall Local List-Decoder (block diagram): R: {0,1}^m → {0,1}, together with the advice, feeds the Self-Corrector, whose outputs feed the Interpolator.

  49. Overall Local List-Decoder (block diagram): as before, but the advice is now produced by the Global List-Decoder: R: {0,1}^m → {0,1} → Global List-Decoder → Self-Corrector → Interpolator.

  50. Extension to Higher Degree • No analogue of rank. • [Kasami-Tokura]: characterizes codewords with weight ≤ 2^{1-r}. • List-decoding up to radius 2^{-r} − ε in poly(m, ε^{-1}).
