
2-source Dispersers for n^o(1) entropy and Ramsey graphs beating the Frankl-Wilson construction

By Boaz Barak, Anup Rao, Ronen Shaltiel, and Avi Wigderson.


Presentation Transcript


  1. 2-source Dispersers for n^o(1) entropy and Ramsey graphs beating the Frankl-Wilson construction. Boaz Barak, Anup Rao, Ronen Shaltiel, Avi Wigderson.

  2. Plan for this talk • Introduction: • Ramsey Graphs. • Randomness extractors. • 2-source extractors/dispersers and their relation to Ramsey graphs. • High level overview of our construction.

  3. Ramsey Graphs
• K-Ramsey graphs are graphs which contain no cliques or anti-cliques of size K.
• [Erdős 1947]: There exists a graph on N vertices with no cliques or anti-cliques of size (2+o(1)) log N.
• One of the first applications of the probabilistic method!
• Erdős also asked whether such graphs can be explicitly constructed.
• Best explicit construction [Frankl and Wilson]: K = exp(log^(1/2+o(1)) N).
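Erdős's existence proof is a union bound, and it can be checked numerically for small parameters. A minimal sketch (not from the talk; the parameters N = 2^10, K = 2 log2 N are chosen for illustration):

```python
import math

# Union bound behind [Erdős 1947]: in a uniformly random graph on N vertices,
# E[# monochromatic K-vertex sets] = C(N, K) * 2 * 2^(-C(K, 2)).
# If this expectation is < 1, some graph has no clique or anti-clique of size K.
def expected_mono_sets_below_one(N, K):
    pairs = K * (K - 1) // 2
    # exact integer comparison: C(N, K) * 2 < 2^pairs
    return 2 * math.comb(N, K) < 2**pairs

# N = 2^10, K = 2 * log2(N) = 20: a 20-Ramsey graph on 1024 vertices exists
assert expected_mono_sets_below_one(1024, 20)
```

The bound fails for much smaller K, matching the (2+o(1)) log N threshold being essentially tight for random graphs.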

  4. Ramsey Graphs (viewed as adjacency matrices)
• Ramsey graph: no large monochromatic rectangles of the form X x X.
• Bipartite Ramsey graph: no large monochromatic rectangles of the form X x Y.
• Every matrix of a bipartite Ramsey graph is a matrix of a Ramsey graph.
• Nonexplicit result: O(log N).
• Known explicit [CG85]: √N.
• [PR04]: o(√N).
• [BKSSW05]: N^δ for every δ > 0.
• Our result: exp(log^δ N) for every δ > 0.

  5. A new construction of Ramsey graphs beating Frankl-Wilson
• Convention: N = 2^n. Identify {1..N} ≈ {0,1}^n.
• Theorem: There is a polynomial-time computable function R: {0,1}^n x {0,1}^n -> {0,1} such that for every δ > 0 and all X, Y ⊆ {0,1}^n of size K = exp(log^δ N) = exp(n^δ): R(X,Y) = {0,1}.
• Strongly explicit construction.

  6. Yet another slide on motivation for extractors
• "Daddy, how do computers get random bits?"
• Do we really have to tell that old story again?

  7. Randomness extractors: how can computers get random bits?
• Computers have access to sources of randomness: electric noise, key strokes of the user, timing of past events.
• These distributions are "somewhat random" but not "truly random".
• Solution: randomness extractors. (Diagram: a somewhat-random source feeds the extractor, which produces random coins for a probabilistic algorithm.)

  8. Notion of entropy
• Somewhat random distributions must "contain randomness".
• Right notion (min-entropy): the min-entropy of a distribution X is the largest k such that Pr[X = x] ≤ 2^-k for every x.
• Unjustified assumption for this talk (with loss of generality): entropy = min-entropy.
• Notation: rate = entropy / length = k/n.
• Flat distributions are uniform over some subset; for these, entropy = log(set size).
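As a concrete illustration of these definitions (not part of the talk), min-entropy and the flat-distribution fact can be computed directly:

```python
import math

def min_entropy(dist):
    """H_min(X): the largest k with Pr[X = x] <= 2^-k for all x,
    i.e. -log2 of the heaviest point of the distribution."""
    return -math.log2(max(dist.values()))

# A flat distribution over a set of size 8 has min-entropy log2(8) = 3;
# viewed over 4-bit strings, its rate is entropy / length = 3/4.
flat = {x: 1 / 8 for x in range(8)}
assert abs(min_entropy(flat) - 3.0) < 1e-9

# A skewed distribution: a single heavy point caps the min-entropy at 1.
skew = {0: 0.5, 1: 0.25, 2: 0.25}
assert abs(min_entropy(skew) - 1.0) < 1e-9
```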

  9. The dream: a 1-source extractor
• We want ext such that whenever H(X) > m, ext(X) ~ U_m.
• Problem: no such thing! (Diagram: an n-bit source X fed into ext, producing m bits for an algorithm.)

  10. Seeded extractor [NZ93]
• (Diagram only: ext now takes a short random seed I = (011…010) in addition to the n-bit source X, and outputs m bits.)

  11. Seeded extractor [NZ93]
• Good for: simulating BPP using weak sources (enumerate all poly(n) seeds; most of the resulting outputs are random).
• Problem: doesn't work for cryptography.

  12. 2-source extractor [SV86]
• Whenever X, Y are independent and H(X), H(Y) ≥ k: ext(X,Y) ~ U_m.
• Such things exist! (Diagram: two n-bit sources x and y feed ext, which outputs m bits.)
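The textbook example of such a function, for two sources whose entropy rates sum to more than 1, is the inner product mod 2 due to Chor and Goldreich. This sketch only illustrates the definition; it is not this paper's construction:

```python
def ip_ext(x, y):
    """Inner product mod 2 of the bit strings of x and y: the classic
    2-source extractor when the two rates sum to > 1 [CG85]."""
    return bin(x & y).count("1") % 2

# With both sources fully uniform on n = 8 bits, the output bias is
# exactly 2^-(n+1): only the all-zero x gives a constant row.
n = 8
ones = sum(ip_ext(x, y) for x in range(2**n) for y in range(2**n))
bias = abs(ones / 4**n - 0.5)
assert bias == 2.0 ** -(n + 1)
```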

  13. 2-source extractors and bipartite Ramsey graphs
• Consider 2-source extractors for independent distributions X and Y with entropy k, namely a function Ext(x,y) (say into one bit).
• Requirement: no unbalanced X x Y rectangles of size K = 2^k.
• 2-source extractor ⇒ bipartite Ramsey graph ⇒ Ramsey graph.

  14. Definitions of 2-source extractors and dispersers
• A 2-source extractor for entropy k is a function Ext(x,y) such that for any two independent distributions X, Y with entropy > k, the output distribution Ext(X,Y) is close to uniform.
• A 2-source disperser for entropy k is a function Dis(x,y) ∊ {0,1} such that for any two independent distributions X, Y with entropy > k, Dis(X,Y) = {0,1} (both output values occur).
• bipartite Ramsey graph = 2-source disperser.
• Our main result: a disperser for entropy k = n^δ for every δ > 0.

  15. Summary and plan
• Main result: an explicit 2-source disperser for entropy k = n^δ for every δ > 0.
• (This gives Ramsey graphs that beat the Frankl-Wilson construction, which achieves δ = ½.)
• The construction and its analysis are quite involved.
• Disclaimer: I will oversimplify in order to try and highlight the main ideas.
• Plan: somewhere-random 2-source extractors; block-wise sources; testing entropy; recursive construction of somewhere-random 2-source extractors; (run out of time…) construction of the TestBlock procedure; something about the final disperser.

  16. 2-source somewhere extractor [BKSSW05]
• Whenever X, Y are independent and H(X), H(Y) ≥ k: ∃i such that SE(X,Y)_i ~ U_m.
• Important step: a somewhere extractor for entropy k = n^δ with n^ε outputs (for 0 < ε < δ).
• Remainder of talk: high-level description of our construction of the somewhere extractor. More ideas are needed to get a disperser.

  17. Block-wise sources [CG88]
• A block-wise source with C blocks and entropy k is a distribution X_1,..,X_C s.t. for all i: H(X_i | X_1,..,X_{i-1}) > k.
• (Entropy versus min-entropy.)
• It's often easier to extract from block-wise sources than from general sources.
• There is an extractor for 2 independent block-wise sources with entropy k and C = O(log n / log k) blocks [Rao06]: a constant number C of blocks for k = n^δ.
• Our result: achieve the same with one block-wise source and one general source (we refer to it as basic-ext).
• Important building block. Relies on [Rao06, Raz05, Bou05].
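The conditional-entropy condition in the definition can be checked numerically for a small example (Shannon entropy, as the talk's entropy-vs-min-entropy simplification allows); this sketch is not part of the talk:

```python
import math
from collections import defaultdict

def cond_entropy(dist):
    """Shannon H(X2 | X1) for a joint distribution {(x1, x2): prob}."""
    marginal = defaultdict(float)
    for (x1, _), p in dist.items():
        marginal[x1] += p
    return -sum(p * math.log2(p / marginal[x1]) for (x1, _), p in dist.items())

# X1 uniform on 2 bits, X2 = X1 xor a fresh uniform 2-bit value:
# each block carries 2 fresh bits, so this is a block-wise source with k = 2.
joint = {(x1, x1 ^ r): 1 / 16 for x1 in range(4) for r in range(4)}
assert abs(cond_entropy(joint) - 2.0) < 1e-9
```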

  18. Roadmap
• Goal: given parameters 0 < ε < δ, construct a 2-source somewhere extractor for entropy k = n^δ with n^ε outputs.
• Following previous work on seeded extractors [NZ93, SZ94, SSZ95, …], given two sources we try to convert one of them into a block-wise source.

  19. 2-source somewhere extractors for large k >> n^(1-ε)
• Split X into t = n^ε blocks X_1,..,X_t.
• Assume that k > 2Cn^(1-ε) = 2Cn/t > length of C blocks.
• Chain rule*: Σ_i H(X_i | X_1,..,X_{i-1}) ≥ k.
• ⇒ ∃ i_1,..,i_C s.t. X_{i_1},..,X_{i_C} is a block-wise source with high (n^(1-2ε)) entropy (roughly the same rate).
• SE(x,y): go over all t^C = n^(εC) candidate block-wise sources; for each one, run basic-ext and collect all n^O(ε) outputs.
• Also works when H(Y) = n^δ.
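The chain-rule step can be verified numerically for a small flat source. A sketch (my own illustration) in which the conditional entropies of the blocks telescope to H(X):

```python
import itertools
import math
import random
from collections import Counter

def shannon(counter, total):
    return -sum(c / total * math.log2(c / total) for c in counter.values())

def block_entropies(support, t):
    """H(X_i | X_1..X_{i-1}) for X flat over `support` (tuples of t blocks),
    computed via the chain rule as H(X_1..X_i) - H(X_1..X_{i-1})."""
    n, prev, out = len(support), 0.0, []
    for i in range(1, t + 1):
        cur = shannon(Counter(s[:i] for s in support), n)
        out.append(cur - prev)
        prev = cur
    return out

# Flat source over 64 distinct strings of t = 3 blocks: the conditional block
# entropies sum to H(X) = log2(64) = 6, so some block has at least 6/3 = 2.
random.seed(0)
support = random.sample(list(itertools.product(range(16), repeat=3)), 64)
hs = block_entropies(support, 3)
assert abs(sum(hs) - 6.0) < 1e-9
assert max(hs) >= 2.0 - 1e-9
```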

  20. 2-source somewhere extractors for small k = n^δ
• Split X into t = n^ε blocks. As k < n/t, all the entropy can be in one block.
• We say that a block X_i has medium entropy if H(X_i | X_1,..,X_{i-1}) ≥ k/2t (same rate), and high entropy if H(X_i | X_1,..,X_{i-1}) ≥ k/2C (more condensed: ≈ rate ∙ t).
• Previous slide: large k ⇒ there must exist C medium blocks.
• Win-win analysis: one of two cases occurs*: either C medium blocks exist (∃ block-wise source), or a high block exists (∃ block i with rate(X_i | X_1,..,X_{i-1}) ≥ rate(X) ∙ Ω(t)).
• Goal: in the 2nd case, identify the high block and continue recursively. Eventually we will get a block-wise source!
• Problem: we only get samples x, y. How can we learn something about the entropy of X, Y? Nevertheless, we will try to implement this strategy!
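The win-win case analysis is an averaging argument: if fewer than C blocks are medium and no block is high, the conditional entropies cannot sum to k. A numeric sketch of the claim (my own illustration, not the talk's code):

```python
import random

def has_medium_or_high(entropies, k, t, C):
    """Either >= C blocks reach the medium threshold k/2t,
    or some block reaches the high threshold k/2C."""
    medium = sum(h >= k / (2 * t) for h in entropies)
    high = any(h >= k / (2 * C) for h in entropies)
    return medium >= C or high

# Whenever the t conditional entropies sum to k, one case must occur:
# otherwise the total would be < t*(k/2t) + C*(k/2C) = k, a contradiction.
random.seed(1)
k, t, C = 100.0, 10, 4
for _ in range(200):
    w = [random.random() for _ in range(t)]
    entropies = [x * k / sum(w) for x in w]  # rescale so the sum is exactly k
    assert has_medium_or_high(entropies, k, t, C)
```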

  21. Testing blocks for entropy (fantasy object)
• We want a procedure TestBlock_{r,i}(x,y) that tests if rate(X_i | X_1,..,X_{i-1}) ≥ r:
• Given 2 independent sources X, Y with sufficient entropy:
• If rate(X_i | X_1,..,X_{i-1}) ≥ r, TestBlock_{r,i}(X,Y) passes w.h.p.
• If rate(X_i | X_1,..,X_{i-1}) < r, TestBlock_{r,i}(X,Y) fails w.h.p.
• Disclaimer: oversimplified and too good to be true. Nevertheless, we can get something with the same flavor.
• We show*: "2-source somewhere extractors for some rate r (with few outputs) give TestBlock for rate r".
• But we want to use TestBlock inside such a construction!?

  22. Recursive construction of a 2-source somewhere extractor
• Given entropy rate r, assume by recursion that we have a 2-source somewhere extractor SE' for the larger rate r' ≈ r ∙ t.
• ⇒ We can run TestBlock with rate r'. (We can test if a block X_i in a source X with rate r is a high block.)
• To operate on rate r we only require testing high blocks (rate r' ≈ r ∙ t); this is what allows the recursion.
• Construction of SE(x,y) (for rate r):
• Go over all t^C = n^(εC) candidate block-wise sources; for each one, run basic-ext and collect all outputs. (Solves the case of C medium-entropy blocks.)
• For i = 1..t, run TestBlock_{r',i}(x,y) to see if X_i has high entropy; run SE'(x_i, y) on the first i on which TestBlock passes. (Solves the case of a high-entropy block.)
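The control flow of this slide can be sketched in code. Everything here is an illustrative stub: `basic_ext` and `test_block` stand in for the actual primitives (basic-ext and TestBlock), the block splitting is arbitrary, and the depth cap models the recursion bottoming out after constantly many levels:

```python
from itertools import combinations

def SE(x, y, rate, t, C, basic_ext, test_block, depth=0):
    """Somewhere-extractor sketch: collect basic-ext outputs for every
    candidate block-wise source, plus a recursive call at rate r' ~ r*t
    on the first block that tests high."""
    outputs = []
    blocks = [x[i::t] for i in range(t)]   # stand-in for splitting x into t blocks
    for idx in combinations(range(t), C):  # all t-choose-C candidate block-wise sources
        outputs.append(basic_ext([blocks[i] for i in idx], y))
    if depth < 3:  # the real recursion stops after constantly many levels
        for i in range(t):
            if test_block(x, y, rate * t, i):  # is X_i a high block at rate r'?
                outputs.extend(
                    SE(blocks[i], y, rate * t, t, C, basic_ext, test_block, depth + 1))
                break
    return outputs

# With stub components and t = 4, C = 2: C(4, 2) = 6 candidates per level.
outs = SE("0" * 16, "1" * 16, 1 / 16, 4, 2,
          lambda bs, y: 0, lambda x, y, r, i: False)
assert len(outs) == 6
```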

  23. Summary and plan
• We've seen: a construction of a 2-source somewhere-random extractor for entropy k = n^δ with n^ε outputs (for any constants 0 < ε < δ).
• Component: TestBlock, a procedure that tests whether rate(X_i | X_1,..,X_{i-1}) ≥ r.
• Next: precise properties of TestBlock; how to construct TestBlock from a 2-source somewhere extractor with n^ε outputs; subsources.

  24. Subsources
• Let X be a flat distribution. A distribution X' is a subsource of X if X' is flat and X' ⊆ X.
• X' has deficiency d if |X'|/|X| ≥ 2^-d.
• Fact: if H(X) ≥ k and X' is a subsource of X with deficiency d, then H(X') ≥ k - d.
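For flat sources the deficiency fact is immediate, since entropy is just the log of the set size. A one-line numeric check (sizes chosen arbitrarily):

```python
import math

# Flat X over a set S has H(X) = log2|S|; a subsource X' of deficiency d
# (|X'|/|S| >= 2^-d) satisfies H(X') = log2|X'| >= log2|S| - d.
S = range(1024)                # H(X) = 10
d = 3
sub = range(len(S) // 2**d)    # |sub| = |S| * 2^-d, deficiency exactly d
assert math.log2(len(sub)) >= math.log2(len(S)) - d
```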

  25. Subsource 2-source extractors (succeed on some subsource)
• A subsource 2-source extractor for entropy k is a function Ext(x,y) s.t. for any two independent distributions X, Y with entropy > k, there exist large independent subsources X', Y' s.t. Ext(X',Y') is close to uniform.
• The extractor is not required to succeed on X, Y but rather on some large subsources X', Y'.
• A subsource 2-source extractor is a 2-source disperser.
• The definition extends to subsource somewhere extractors.

  26. Testing blocks for entropy (precise version with subsources)
• TestBlock_{r,i}(x,y) (tests if rate(X_i | X_1,..,X_{i-1}) ≥ r):
• Given two independent sources X, Y with sufficient entropy:
• If rate(X_i | X_1,..,X_{i-1}) ≥ r, ∃ subsources X', Y' s.t. TestBlock_{r,i}(X',Y') passes w.h.p., with H(X'_i | X'_1,..,X'_{i-1}) ≈ H(X_i | X_1,..,X_{i-1}) and H(Y') ≈ H(Y).
• If rate(X_i | X_1,..,X_{i-1}) < r, ∃ subsources X', Y' s.t. TestBlock_{r,i}(X',Y') fails w.h.p., and for j > i: H(X'_j | X'_1,..,X'_{j-1}) ≈ H(X_j | X_1,..,X_{j-1}), H(Y') ≈ H(Y).

  27. Recursive construction of a subsource 2-source somewhere extractor
• Given entropy rate r, assume by recursion that we have a subsource 2-source somewhere extractor SE' for the larger rate r' ≈ r ∙ t.
• ⇒* We can run TestBlock with rate r'.
• Construction of SE(x,y):
• Go over all t^C = n^(εC) candidate block-wise sources; for each one, run basic-ext and collect all outputs. (Solves the case of C medium-entropy blocks.)
• For i = 1..t, run TestBlock_{r',i}(x,y); run SE'(x_i, y) on the first i on which TestBlock passes. (Solves the case of a high-entropy block, on a subsource.)

  28. Testing blocks for entropy: the challenge-response method [BKSSW05]
• TestBlock_{r,i}(x,y) tests whether rate(X_i | X_1,..,X_{i-1}) ≥ r:
• Challenge: C = SE'(x_i, y), a somewhere extractor for rate r with n^ε outputs of length n^ε, so |C| = n^ε ∙ n^ε = n^(2ε).
• Responses: R_j = poly-SE(x,y)_j, where poly-SE is a specially designed somewhere extractor with poly(n) outputs and additional properties: poly-SE(x,y)_j = Vaz(E(x,j), E(y,j)). (Different from [BKSSW05].)
• TestBlock passes if ∀j: R_j ≠ C.
• Pass case: if rate(X_i | X_1,..,X_{i-1}) ≥ r, ∃ subsources X', Y' s.t. TestBlock_{r,i}(X',Y') passes w.h.p., with H(X'_i | X'_1,..,X'_{i-1}) ≈ H(X_i | X_1,..,X_{i-1}) and H(Y') ≈ H(Y):
• rate(X_i | X_1,..,X_{i-1}) ≥ r ⇒ ∃k: C_k is random ⇒ H(C) is large.
• Special properties of poly-SE ⇒ ∀j: H(C | R_j) is large.
• ⇒ w.h.p. ∀j: R_j ≠ C. (We didn't use subsources!?)

  29. Testing blocks for entropy: the challenge-response method [BKSSW05] (cont.)
• As before: C = SE'(x_i, y), R_j = poly-SE(x,y)_j, and TestBlock passes if ∀j: R_j ≠ C.
• Fail case: if rate(X_i | X_1,..,X_{i-1}) < r, ∃ subsources X', Y' s.t. TestBlock_{r,i}(X',Y') fails w.h.p., and for j > i: H(X'_j | X'_1,..,X'_{j-1}) ≈ H(X_j | X_1,..,X_{j-1}), H(Y') ≈ H(Y):
• rate(X_i) < r ⇒ we can fix X_i and still have entropy left in X'.
• C is a function of Y ⇒ ∃ subsource Y' s.t. C is constant.
• X', Y' independent ⇒ ∃j: R_j is random ⇒ Pr[R_j = C] ≥ 2^-|C|. (Uses the special properties of poly-SE.)
• ⇒ We can lose |C| bits and go to subsources on which R_j = C.

  30. Story so far
• Goal: a 2-source disperser for entropy k = n^δ.
• Component: a subsource 2-source somewhere extractor for entropy k = n^δ with n^ε outputs (a recursive win-win construction).
• Component: TestBlock_{r,i}(x,y), which tests if rate(X_i | X_1,..,X_{i-1}) ≥ r, constructed using the subsource 2-source somewhere extractor (from the recursion hypothesis).

  31. Constructing a (1-output) disperser
• Having constructed a (subsource) somewhere extractor for entropy n^δ, we can run TestBlock to test whether X_i is a medium-entropy block.
• We observe that on a medium block X_i:
• Pr[TestBlock_i(X,Y) passes] > 1 - o(1) (on a subsource).
• Pr[TestBlock_i(X,Y) fails] > exp(-n^ε) (on the same subsource).
• So TestBlock outputs two different values: this is close to a disperser!

  32. High-level idea for the disperser
• Disperser(x,y): for i = 1 to t:
• Test the entropy of the i'th block.
• If the block has low entropy, continue.
• If the block has high entropy, recurse on it (output Disperser(x_i, y)).
• If the block has medium entropy, run TestBlock on the block and output pass/fail.
• This requires designing a more complicated TestBlock function with 4 possible outputs.
• The construction and analysis use ideas similar to the previous construction but are more involved.
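The loop on this slide can be sketched as follows; `classify` is an illustrative stand-in for the 4-outcome TestBlock, and the block splitting is arbitrary:

```python
def disperser(x, y, t, classify, depth=0):
    """Slide-32 control flow: skip low blocks, recurse into a high block,
    and output the pass/fail bit of the first medium block."""
    blocks = [x[i::t] for i in range(t)]  # stand-in for splitting x into t blocks
    for i in range(t):
        verdict = classify(x, y, i, depth)
        if verdict == "low":
            continue                                    # no entropy here
        if verdict == "high":
            return disperser(blocks[i], y, t, classify, depth + 1)
        _, bit = verdict                                # ("medium", pass/fail bit)
        return bit
    return 0  # fallback; not reached when the sources have enough entropy

# Stub classifier: block 1 is high at the top level; one level down,
# the first block is medium and yields output bit 1.
def classify(x, y, i, depth):
    if depth == 0:
        return "high" if i == 1 else "low"
    return ("medium", 1)

assert disperser("01" * 8, "1" * 16, 4, classify) == 1
```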

  33. Conclusions and open problems
• We were able to construct a 2-source disperser for entropy n^o(1); equivalently, K-Ramsey graphs for K = exp(log^o(1) N).
• Open problems:
• Construct a 2-source extractor for entropy rate < 0.4999 (improve [Bou05]).
• Construct a disperser for entropy polylog n.
• Main open problem: simplify the construction and proof.

  34. That’s it
