Error-Correcting Codes and Pseudorandom Projections

Presentation Transcript


  1. Error-Correcting Codes and Pseudorandom Projections. Luca Trevisan, U.C. Berkeley

  2. About this talk
  • Take the input, encode it with an error-correcting code, and restrict the codeword to a (pseudo)randomly chosen subset of its bits
  • This approach can be used to construct hash functions, randomness extractors, pseudorandom generators, and more
  • Several different applications of the approach exist, with very different analyses
  • If the moral of [Vadhan, 2001] is right, all of these objects are essentially the same, so it is not surprising that one approach works for all of them. Then why are the analyses so different?

  3. Disclaimers
  • This talk will:
    • be technically imprecise
    • lack proper credits and "historical" perspective
    • have an open finale rather than a happy ending

  4. Cast of Characters
  • Hash functions: map an input to a "random" output
  • Randomness extractors: map a "weakly random" input to a random output
  • Pseudorandom generators: map a short random input to a long pseudorandom output
  • Error-correcting codes

  5. Hash Functions
  [Diagram: input x and seed s fed into H, producing Hs(x)]
  • For x ≠ y and a random s, Hs(x) is very likely to differ from Hs(y).

  6. Error-Correcting Codes
  [Diagram: input x fed into encoder C, producing C(x)]
  • Injective map from n bits to N bits (typically, N = O(n))
  • If x ≠ y, then C(x) and C(y) differ in many places (typically, in Ω(N) positions)
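
As an illustration (mine, not from the slides), here is a minimal Python sketch of a toy code with these properties: the Hadamard code, which maps n bits to N = 2^n bits and guarantees that distinct codewords differ in exactly N/2 positions. It is far too long to be the N = O(n) code the slide has in mind; it only makes the distance property concrete.

    # Toy error-correcting code: the Hadamard code.
    # Encodes an n-bit string x as the list of all parities <x, a> for a in {0,1}^n.
    # Any two distinct codewords differ in exactly half of the N = 2^n positions.
    from itertools import product

    def hadamard_encode(x_bits):
        """Encode a tuple of 0/1 values; the output has length 2**len(x_bits)."""
        n = len(x_bits)
        return [sum(xi * ai for xi, ai in zip(x_bits, a)) % 2
                for a in product((0, 1), repeat=n)]

    def hamming_distance(u, v):
        return sum(ui != vi for ui, vi in zip(u, v))

    if __name__ == "__main__":
        x = (1, 0, 1, 1, 0)
        y = (1, 1, 1, 0, 0)
        cx, cy = hadamard_encode(x), hadamard_encode(y)
        print(len(cx), hamming_distance(cx, cy))  # 32 16: distance is exactly N/2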

  7. Hash Functions from Error-Correcting Codes
  • Used several times: ISW00, MV99, M98, . . ., GW94, . . .
  [Diagram: x and y are encoded as C(x) and C(y); the same seed s projects each codeword to Hs(x) and Hs(y)]

  8. Analysis
  • Say that C() maps n bits into N bits, and that if x ≠ y then C(x) and C(y) differ in at least N/3 places.
  • s describes a subset of the N positions of size m
  • Hs(x) is C(x) "projected" to the bits selected by s
  • If x ≠ y, then Pr_s [Hs(x) = Hs(y)] < (2/3)^m
  • s can be specified using m log n random bits
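
A minimal Python sketch of this hashing scheme (my illustration, not from the talk): the seed s names m codeword positions, and Hs(x) is the codeword of x restricted to them. The code C here is the toy Hadamard code from the previous sketch, whose distance is N/2 rather than N/3, so the collision bound becomes (1/2)^m instead of (2/3)^m.

    # Hash-from-code sketch: Hs(x) = C(x) restricted to the m positions named by the seed s.
    import random
    from itertools import product

    def hadamard_encode(x_bits):
        return [sum(xi * ai for xi, ai in zip(x_bits, a)) % 2
                for a in product((0, 1), repeat=len(x_bits))]

    def projection_hash(x_bits, seed_positions):
        codeword = hadamard_encode(x_bits)
        return tuple(codeword[i] for i in seed_positions)

    if __name__ == "__main__":
        n, m, trials = 6, 8, 10000
        N = 2 ** n
        x = (1, 0, 1, 1, 0, 0)
        y = (0, 1, 1, 1, 0, 1)
        collisions = 0
        for _ in range(trials):
            s = random.sample(range(N), m)   # seed: m distinct codeword positions
            if projection_hash(x, s) == projection_hash(y, s):
                collisions += 1
        # The empirical collision rate should be at most (1/2)^m, about 0.004 here.
        print(collisions / trials)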

  9. Other Observations
  • Projection points can be chosen along a random walk on an expander
  • Collision probability 2^-k is achievable with log n + O(k) random bits
  • Used in one of the hash functions of [Goldreich-Wigderson, 1994]
  • In a RAM with division and multiplication, the error-correcting code and the projection (and so, the hash function) are computable in O(1) time [Miltersen, 1998]

  10. Extractors
  [Diagram: x (n bits, min-entropy k) and seed s (d bits, uniform) fed into E, producing E(s,x) (m bits, close to uniform)]
  • If x is sampled from a distribution with min-entropy k, and s is uniform, then E(s,x) is almost uniform
  • Similar to hash functions, but we want d = O(log n)
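
For reference (standard definitions that the slide leaves implicit; ε is the error parameter), X has min-entropy k if no outcome has probability above 2^-k, and E is a (k, ε)-extractor if its output is ε-close to uniform in statistical distance:

    H_\infty(X) \ge k \iff \Pr[X = x] \le 2^{-k} \text{ for all } x,
    \qquad
    \max_{T \subseteq \{0,1\}^m} \Bigl| \Pr[E(U_d, X) \in T] - \Pr[U_m \in T] \Bigr| \le \varepsilon .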

  11. Pseudorandom Projections
  [Diagram: x is encoded as C(x); a projection Proj driven by the seed s maps C(x) to the output E(s,x)]

  12.–13. The Nisan-Wigderson projection generator
  [Diagram, animated over two slides: the seed s shown as a bit string, with overlapping blocks of its bits marked; each block, read as an index, defines one of the positions a1, a2, a3, a4]
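
A minimal Python sketch of the NW projection generator (my illustration with toy parameters; the design below is the standard polynomial-based one, not necessarily the one drawn on the slides). The seed positions are indexed by pairs in F_q x F_q, each design set S_f is the graph of a low-degree polynomial f, and the i-th output position a_i is the block of seed bits indexed by the i-th set, read as an index into the codeword.

    # Nisan-Wigderson-style projection generator (toy parameters).
    # Seed positions are identified with F_q x F_q (q prime), so the seed has q*q bits.
    # Each polynomial f of degree < c over F_q gives S_f = {(a, f(a)) : a in F_q}, of size q;
    # two distinct sets intersect in fewer than c points, which is the design property.
    from itertools import product
    import random

    Q = 5        # field size; seed length is Q*Q = 25 bits
    C_DEG = 2    # degree bound; gives Q**C_DEG = 25 design sets, hence 25 output positions

    def design_sets(q=Q, c=C_DEG):
        sets = []
        for coeffs in product(range(q), repeat=c):      # f(a) = c0 + c1*a + ...
            graph = [(a, sum(co * a**j for j, co in enumerate(coeffs)) % q) for a in range(q)]
            sets.append([a * q + b for (a, b) in graph])  # flatten (a, b) to a seed index
        return sets

    def nw_positions(seed_bits, sets):
        """Read each block of seed bits as an index into a length-2^Q codeword."""
        return [int("".join(str(seed_bits[i]) for i in s_f), 2) for s_f in sets]

    def nw_project(codeword, seed_bits, sets):
        """NW(s, x): the codeword restricted to the seed-defined positions."""
        return [codeword[p] for p in nw_positions(seed_bits, sets)]

    if __name__ == "__main__":
        sets = design_sets()
        seed = [random.randrange(2) for _ in range(Q * Q)]
        codeword = [random.randrange(2) for _ in range(2 ** Q)]  # stands in for C(x) in this demo
        print(nw_project(codeword, seed, sets))                  # 25 projected bits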

  14. Notions of almost-independence
  • The standard notion of "almost" independence for random variables A1, …, Am implies:
    • there are conditional distributions where A1, …, Am-1 are fixed, yet Am still has high entropy
  • In NW, Am is determined once A1, …, Am-1 are fixed
  • However, in NW, there are conditional distributions where Am is completely random, yet each of A1, …, Am-1 has very low entropy

  15. Properties
  • Let NW(s,x) be x projected to the coordinates generated from s using the NW generator.
  • Suppose that D is a procedure that, for a random s, distinguishes NW(s,x) from uniform.
  • Then there is a string x' "close" to x such that:
    • x' has a small description given D
    • x' is "efficiently computable" given D and a small amount of additional information

  16. Extractor Based on NW
  [Diagram: x is encoded as C(x); the NW projection driven by the seed s maps C(x) to the output E(s,x)]

  17. Analysis
  • If it were not a good extractor, there would be a distribution X of high min-entropy and a function D such that, for a random s, D distinguishes NW(s, C(X)) from uniform
  • For most (fixed) x taken from X, D would distinguish NW(s, C(x)) from uniform
  • For each such x, there is an x' close to C(x) with a small description
  • But if X has high min-entropy, then with high probability C(X) is not close to any string of small description complexity
  • Contradiction: so it is a good extractor
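
The counting behind the last two bullets can be made explicit (a sketch under my reading of the argument; t is the description length, δ the closeness parameter, and L a list-decoding bound on how many codewords can lie within relative distance δ of any fixed string):

    \#\{\text{strings with a description of length} \le t\} \le 2^{t+1},
    \qquad
    \Pr_{x \sim X}\bigl[\exists\, x' \text{ with a } t\text{-bit description}: \Delta(C(x), x') \le \delta N \bigr] \le 2^{t+1} \cdot L \cdot 2^{-k} .

This is negligible once the min-entropy k is much larger than t + log L, contradicting the assumption that the distinguisher succeeds for most x.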

  18. B.B. Pseudorandom Generators
  [Diagram: f (description of a function of high circuit complexity) and seed s (d bits, uniform) fed into G, producing Gf(s) (m bits, pseudorandom)]
  • If f has high circuit complexity and s is uniform, then Gf(s) is indistinguishable from uniform
  • Similar to extractors, but with computational requirements
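
Spelled out (the standard formulation; the circuit-size bound t and error ε are parameters the slide does not name), "indistinguishable from uniform" means that for every circuit D of size at most t,

    \Bigl| \Pr_{s \sim U_d}\bigl[D(G_f(s)) = 1\bigr] - \Pr_{u \sim U_m}\bigl[D(u) = 1\bigr] \Bigr| \le \varepsilon .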

  19. A Construction
  • Encode the truth table of the function f using an error-correcting code based on multivariate polynomials
  • Project the encoded truth table to a subset of entries chosen using the seed s and the NW projection generator
  • Essentially the same as the extractor seen before
  • Analysis:
    • Need the error-correcting code to have a "sub-linear time list-decoding" procedure [STV99]
    • Need the computational version of the analysis of the NW projection generator
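
To make the first bullet concrete, here is a small Python sketch (my illustration; it uses the multilinear extension over a prime field as the polynomial encoding, whereas [STV99] uses a more general low-degree extension together with list decoding). The truth table of f : {0,1}^l -> {0,1} is extended to a polynomial of degree at most 1 in each variable over F_p, and the codeword is the table of that polynomial's values on all of F_p^l.

    # Low-degree (multilinear) extension of a Boolean function's truth table over F_p:
    # p_f(z) = sum over a in {0,1}^l of f(a) * prod_i (z_i*a_i + (1-z_i)*(1-a_i))  (mod p).
    # It agrees with f on {0,1}^l and has degree at most 1 in each variable.
    from itertools import product

    P = 13  # a small prime; the field is F_p

    def multilinear_extension(truth_table, num_vars, p=P):
        """Return p_f as a function from F_p^num_vars to F_p."""
        def p_f(z):
            total = 0
            for a in product((0, 1), repeat=num_vars):
                f_a = truth_table[int("".join(map(str, a)), 2)]
                weight = 1
                for z_i, a_i in zip(z, a):
                    weight = weight * (z_i * a_i + (1 - z_i) * (1 - a_i)) % p
                total = (total + f_a * weight) % p
            return total
        return p_f

    if __name__ == "__main__":
        table = [0, 1, 1, 0, 1, 0, 0, 1]   # f = parity of 3 input bits
        p_f = multilinear_extension(table, 3)
        print([p_f(a) for a in product((0, 1), repeat=3)])  # reproduces the truth table
        print(p_f((5, 2, 11)))             # one codeword entry outside the Boolean cube
        # The full codeword is the list of p_f(z) over all z in F_p^3 (length P**3).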

  20. Notes
  • Things were discovered in reverse order (pseudorandom generator first, extractor later)
  • In the original proof [Impagliazzo-Wigderson, 1997], the encoding of f is not presented as a good error-correcting code (and the analysis does not use list-decoding)

  21. Fully Algebraic Construction?
  • In the NW-based extractor, and in a possible implementation of the NW-based pseudorandom generator:
    • The input x (resp., the function f) is encoded as a multivariate polynomial p
    • The seed s is used to generate points a1, …, am
    • The output is p(a1), …, p(am) [up to minor cheating]
  • There is no algebraic meaning to a1, …, am
  • How about choosing a1, …, am on a random line?
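
A small Python sketch of that last question (my illustration, reusing the multilinear extension above as the polynomial p): the seed picks a random line z(i) = u + i*v in F_p^l, and the output is p evaluated at the first m points of that line.

    # Evaluate an encoded input on a random (non-axis-parallel) line in F_p^l.
    import random
    from itertools import product

    P, L, M = 13, 3, 5   # field size, number of variables, number of output symbols

    def multilinear_extension(table, num_vars, p=P):
        def p_f(z):
            total = 0
            for a in product((0, 1), repeat=num_vars):
                w = 1
                for z_i, a_i in zip(z, a):
                    w = w * (z_i * a_i + (1 - z_i) * (1 - a_i)) % p
                total = (total + table[int("".join(map(str, a)), 2)] * w) % p
            return total
        return p_f

    if __name__ == "__main__":
        table = [0, 1, 1, 0, 1, 0, 0, 1]                      # truth table of the encoded input
        p_f = multilinear_extension(table, L)
        u = [random.randrange(P) for _ in range(L)]           # seed part 1: a point on the line
        v = [random.randrange(P) for _ in range(L)]           # seed part 2: the direction
        line = [tuple((ui + i * vi) % P for ui, vi in zip(u, v)) for i in range(M)]
        print([p_f(z) for z in line])                         # the m output symbols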

  22. Miltersen-Vinodchandran
  • Encode the function f as a multivariate polynomial p
  • Use the seed s to pick an axis-parallel line
  • Output the values of p restricted to the line
  • Does not give an extractor or a pseudorandom generator, but (with some more machinery) gives a good hitting set generator
  • The analysis uses the observation that a random projection of a code is a good hash function

  23. Ta-Shma-Zuckerman-Safra
  • Encode the input x as a multivariate polynomial p
  • Use the seed s to select an axis-parallel line and a starting point on the line
  • Output the values of p at a few consecutive points on the line, beginning with the starting point
  • Gives a good extractor
  • The analysis has the same high-level structure as the analysis of the NW-based extractor: a distinguisher implies a short description for x
  • Note: the short description is not computationally efficient; the construction does not imply a p.r.g.

  24. Shaltiel-Umans
  • Encode the input x as a multivariate polynomial p over F^d
  • Use the seed s to pick a generator g of F^d
  • Evaluate p on g, g^2, . . .
  • A distinguisher implies that x has a (computationally efficient) short description
  • Gives extractors and p.r.g.s; performance as good as that of the best optimized previous constructions

  25. Conclusions?
  • Which choices of pseudorandom projections are good for turning error-correcting codes into extractors / pseudorandom generators, and why?
  • Do good extractors / p.r.g.s follow from encoding with multivariate polynomials and projecting onto parts of a random (non-axis-parallel) line?
  • The NW projections give extractors using any error-correcting code. Are there alternative methods with the same generality?
