
Presentation Transcript


  1. Randomness and Pseudorandomness Omer Reingold, Microsoft Research and Weizmann

  2. Randomness and Pseudorandomness • When Randomness is Useful • When Randomness can be reduced or eliminated – derandomization • Basic Tool: Pseudorandomness • An object is pseudorandom if it “looks random” (indistinguishable from uniform), though it is not. • Expander Graphs

  3. Randomness In Computation (1) • Distributed computing (breaking symmetry) • Cryptography: Secrets, Semantic Security, … • Sampling, Simulations, … Can’t live without you

  4. Randomness In Computation (2) • Communication Complexity (e.g., equality) • Routing (on the cube [Valiant]) - drastically reduces congestion You change my world

  5. Randomness In Computation (3) • In algorithms – useful design tool, but many times can derandomize (e.g., PRIMES in P). Is it always the case? • BPP=P means that every randomized algorithm can be derandomized with only polynomial increase in time • RL=L means that every randomized algorithm can be derandomized with only a constant factor increase in memory Do I really need you?

  6. In Distributed Computing • Dining Philosophers: breaking symmetry [Figure: “Don’t Attack” / “Attack Now”]

  7. Randomness Saves Communication • Task: test whether a copy equals the original file (? = Original File vs. Copy) • Deterministic: need to send the entire file! • Randomness in the Sky (shared randomness): O(1) bits (or logarithmic in 1/error) • Private Randomness: logarithmic number of bits (derandomization)
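To make the shared-randomness protocol concrete, here is a minimal Python sketch of polynomial fingerprinting (the fixed prime P, the random evaluation point r, and the sample file contents are illustrative choices, not from the slides):

```python
import random

P = 2**61 - 1  # fixed prime; distinct files collide with probability <= len/P

def fingerprint(data: bytes, r: int) -> int:
    # Evaluate the polynomial whose coefficients are the bytes of `data`
    # at the shared random point r, modulo the prime P (Horner's rule).
    acc = 0
    for b in data:
        acc = (acc * r + b) % P
    return acc

r = random.randrange(1, P)        # the shared "randomness in the sky"
original = b"contents of the original file"
copy     = b"contents of the original file"
# Only ~61 bits cross the channel instead of the whole file; equal files
# always match, different files match with probability at most len/P.
print(fingerprint(original, r) == fingerprint(copy, r))
```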

  8. In Cryptography • Private Keys: no randomness, no secrets and no identities • Encryption: two encryptions of the same message with the same key need to be different • Randomized (interactive) Proofs: give rise to wonderful new notions: Zero-Knowledge, PCPs, …

  9. Random Walks and Markov Chains • When in doubt, flip a coin: • Explore a graph: minimal memory • Page Rank: stationary distribution of a Markov Chain • Sampling vs. approximate counting: estimating the size of the Web • Simulations of physical systems • …
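As one concrete instance of the stationary-distribution idea, a power-iteration sketch of PageRank for the “random surfer” Markov chain (the tiny `web` graph and the damping factor 0.85 are illustrative assumptions):

```python
def pagerank(graph, damping=0.85, iters=50):
    # With probability `damping` the surfer follows a random out-link,
    # otherwise it jumps to a uniformly random page; iterate the chain
    # until the distribution stabilizes.
    n = len(graph)
    rank = {v: 1.0 / n for v in graph}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in graph}
        for v, out in graph.items():
            if out:
                for u in out:
                    new[u] += damping * rank[v] / len(out)
            else:  # dangling page: spread its mass uniformly
                for u in graph:
                    new[u] += damping * rank[v] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(web))  # the stationary distribution = the pages' ranks
```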

  10. Shake Your Input • Communication network (n-dimensional cube): every deterministic routing scheme will incur exponentially busy links (in the worst case) • Valiant: to send a message from x → y, select a node z at random and send x → z → y. Now: O(1) expected load for every edge • Another example: randomized quicksort • Smoothed Analysis: small perturbations, big impact
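A sketch of Valiant's two-phase scheme on the n-cube, assuming the standard bit-fixing paths between nodes (the function names and the 4-dimensional example are mine):

```python
import random

def bit_fixing_path(x: int, y: int, dims: int):
    # Deterministic route on the dims-cube: flip the bits where x and y
    # differ, one coordinate at a time, lowest bit first.
    path = [x]
    for i in range(dims):
        if (x ^ y) >> i & 1:
            x ^= 1 << i
            path.append(x)
    return path

def valiant_route(x: int, y: int, dims: int):
    # Phase 1: route to a uniformly random intermediate node z.
    # Phase 2: route from z to the true destination y.
    # The random detour spreads the load: O(1) expected load per edge.
    z = random.randrange(2**dims)
    return bit_fixing_path(x, z, dims) + bit_fixing_path(z, y, dims)[1:]

print(valiant_route(0b0000, 0b1011, dims=4))  # list of visited nodes
```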

  11. In Private Data Analysis • Goal: hide the presence/absence of any individual • How many people in the database have the BC1 gene? Add random noise to the true answer, distributed as Lap(Δ/ε) • More questions? More privacy? Need more noise. [Figure: Laplace density; the ratio between shifted copies of the curve is bounded]
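A sketch of the Laplace mechanism just described (the helper names and the example values are hypothetical; a random sign times `random.expovariate` yields a Laplace sample):

```python
import random

def laplace_noise(scale: float) -> float:
    # Lap(scale): a random sign times an Exponential with mean `scale`.
    return random.choice([-1.0, 1.0]) * random.expovariate(1.0 / scale)

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query ("how many people have the BC1 gene?") has
    # sensitivity Delta = 1: one individual changes the count by at most 1.
    # Adding Lap(Delta/epsilon) noise hides any individual's presence.
    return true_count + laplace_noise(1.0 / epsilon)

print(private_count(42, epsilon=0.1))  # smaller epsilon = more noise
```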

  12. Randomness and Pseudorandomness • When Randomness is Useful • When Randomness can be reduced or eliminated – derandomization • Basic Tool: Pseudorandomness • An object is pseudorandom if it “looks random” (indistinguishable from uniform), though it is not. • Expander Graphs

  13. Cryptography: Good Pseudorandom Generators are Crucial • With them, we have one-time pad (and more) • Without, keys are bad, algorithms are worthless (theoretical & practical) [Figure: short key K0 = 110 is expanded to derived key K = 01100100; encryption (E): ciphertext = plaintext ⊕ K = 01101011; decryption (D): ciphertext ⊕ K = plaintext = 00001111]
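A toy illustration of stretching a short key into a one-time pad; Python's seeded Mersenne Twister stands in for a real pseudorandom generator here and is NOT cryptographically secure:

```python
import random

def expand_key(short_key: int, nbytes: int) -> bytes:
    # Stand-in PRG: stretch the short seed into a long keystream.
    # (A real system would use a cryptographic PRG / stream cipher.)
    rng = random.Random(short_key)
    return bytes(rng.randrange(256) for _ in range(nbytes))

def xor_with_keystream(data: bytes, short_key: int) -> bytes:
    keystream = expand_key(short_key, len(data))
    return bytes(d ^ k for d, k in zip(data, keystream))

msg = b"attack at dawn"
ciphertext = xor_with_keystream(msg, short_key=0b110)
# XOR-ing with the same keystream decrypts: (msg ^ K) ^ K = msg.
assert xor_with_keystream(ciphertext, short_key=0b110) == msg
```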

  14. Data Structures & Hash Functions • Linear Probing: store Bob at slot F(Bob) (or the next free slot) • If F is random then insertion time and query time are O(1) (in expectation) • But where do you store a random function?!? Derandomize! • Heuristic: use SHA1, MD4, … • Recently (2007): 5-wise independent functions are sufficient* • Similar considerations all over: Bloom filters, cuckoo hashing, bit-vectors, …
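A standard construction of a 5-wise independent family is a random degree-4 polynomial over a prime field; a sketch (the field prime and table size are illustrative, and the small bias introduced by the final `mod m` is ignored):

```python
import random

P = 2**61 - 1  # prime, assumed larger than the key universe

class FiveWiseHash:
    # h(x) = ((a4*x^4 + a3*x^3 + a2*x^2 + a1*x + a0) mod P) mod m.
    # A random degree-4 polynomial over F_P is 5-wise independent, which
    # suffices for O(1) expected-time linear probing per the 2007 result.
    def __init__(self, m: int):
        self.m = m
        self.coeffs = [random.randrange(P) for _ in range(5)]

    def __call__(self, x: int) -> int:
        acc = 0
        for c in self.coeffs:  # Horner evaluation, highest degree first
            acc = (acc * x + c) % P
        return acc % self.m

h = FiveWiseHash(m=16)
print(h(42))  # linear probing: try slot h(x), then h(x)+1, h(x)+2, ...
```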

  15. Weak Sources & Randomness Extractors • Available random bits are biased and correlated • Von Neumann sources: b1, b2, …, bi, … are i.i.d. 0/1 variables with bi = 1 with some probability p < 1; then translate 01 → 1 and 10 → 0 (discarding 00 and 11) • Randomness Extractors produce randomness from general weak sources; many other applications
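Von Neumann's extractor is short enough to state in code; a sketch with an assumed bias p = 0.9:

```python
import random

def von_neumann_extract(bits):
    # Read the biased bits in pairs: 01 -> 1, 10 -> 0, discard 00 and 11.
    # Pr[01] = (1-p)*p = p*(1-p) = Pr[10], so each output bit is unbiased.
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):
        if b1 != b2:
            out.append(b2)
    return out

biased = [1 if random.random() < 0.9 else 0 for _ in range(10_000)]
fair = von_neumann_extract(biased)
print(sum(fair) / len(fair))  # close to 0.5, from heavily biased input
```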

  16. Algorithms: Can Randomness Save Time or Memory? • Conjecture: No* (*moderate overheads may still apply) • Examples of derandomization: Primality Testing in polynomial time; Graph Connectivity in logarithmic memory • Holdouts: identity testing, approximation algorithms, …

  17. N N S, |S| K |(S)|  A |S| (A > 1) D (Bipartite) Expander Graphs Important: every (not too large) set expands.

  18. N N S, |S| K |(S)|  A |S| (A > 1) D (Bipartite) Expander Graphs • Main goal: minimize D(i.e. constant D) • Degree 3 random graphs are expanders! [Pin73]

  19. N N S, |S| K |(S)|  A |S| (A > 1) D (Bipartite) Expander Graphs Also: maximize A. • Trivial upper bound: A  D • even A ≲ D-1 • Random graphs: AD-1

  20. Applications of Expanders These “innocent” looking objects are intimately related to various fundamental problems: • Network design (fault tolerance), • Sorting networks, • Complexity and proof theory, • Derandomization, • Error correcting codes, • Cryptography, • Ramsey theory • And more ...

  21. Non-blocking Network with On-line Path Selection [ALM] • N inputs, N outputs; depth O(log N), size O(N log N), bounded degree • Allows connections between input and output nodes using vertex-disjoint paths.

  22. Non-blocking Network with On-line Path Selection [ALM] • Every request for connection (or disconnection) is satisfied in O(log N) bit-steps • On-line; handles many requests in parallel.

  23. The Network • Built around a “lossless” expander [Figure: the network, with N inputs and N outputs]

  24. N M= N D S, |S| K |(S)| 0.9 D |S| 0< 1 is an arbitrary constant D is constant & K= (M/D) =  (N/D). Slightly Unbalanced, “Lossless” Expanders [CRVW 02]: such expanders (with D = polylog(1/))

  25. Property 1: A Very Strong Unique-Neighbor Property • If every S with |S| ≤ K has |Γ(S)| ≥ 0.9·D·|S|, then S has ≥ 0.8·D·|S| unique neighbors! [Figure: unique vs. non-unique neighbors of S; the counting argument is spelled out below]
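The counting argument behind this implication, spelled out (a short derivation following the slide's parameters):

```latex
\begin{align*}
|U| + |R| \;=\; |\Gamma(S)| \;&\ge\; 0.9\,D|S|
  && (U:\ \text{unique},\ R:\ \text{non-unique neighbors of } S)\\
|U| + 2|R| \;&\le\; D|S|
  && (\text{each non-unique neighbor absorbs} \ge 2 \text{ of the } D|S| \text{ edges})\\
\Rightarrow\quad |R| \;\le\; 0.1\,D|S|,
  \qquad |U| \;&\ge\; 0.9\,D|S| - 0.1\,D|S| \;=\; 0.8\,D|S|.
\end{align*}
```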

  26. Using Unique Neighbors for Distributed Routing • Task: match S to its neighbors (|S| ≤ K) • Step I: match S to its unique neighbors • Continue recursively with the unmatched vertices S'.

  27. Reminder: The Network • Adding new paths: think of vertices used by previous paths as faulty.

  28. Property 2: Incredibly Fault Tolerant • Every set S with |S| ≤ K has |Γ(S)| ≥ 0.9·D·|S| • Remains a lossless expander even if an adversary removes 0.7·D edges from each vertex.

  29. Simple Expander Codes [G63, Z71, ZP76, T81, SS96] • N variables (left), M = αN parity checks (right) [Figure: Tanner graph with parity-check nodes marked +] • Linear code; rate = 1 - M/N = 1 - α • Minimum distance ≥ K; relative distance ≥ K/N = Ω(α/D) = α/polylog(1/α) • For small α this beats the Zyablov bound and is quite close to the Gilbert-Varshamov bound of α/log(1/α).

  30. Simple Decoding Algorithm in Linear Time (& log n parallel phases) [SS 96] • Algorithm: at each phase, flip every variable that “sees” a majority of unsatisfied constraints • Analysis: an error set B with |B| ≤ K/2 has |Γ(B)| > 0.9·D·|B| but fewer than 0.2·D·|B| satisfied neighboring constraints; hence |Flip \ B| ≤ |B|/4 and |B \ Flip| ≤ |B|/4, so |B_new| ≤ |B|/2 [Figure: Tanner graph with the error set and unsatisfied checks highlighted]
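A toy sketch of this flipping decoder; the stated guarantees require the parity-check graph to be a lossless expander, which the tiny example below is not:

```python
def flip_decode(checks, word, max_phases=50):
    # checks[j] lists the variables in parity check j; word is a bit list.
    # Each phase flips, in parallel, every variable for which a strict
    # majority of its checks are unsatisfied [SS96].
    var_checks = {}
    for j, vs in enumerate(checks):
        for v in vs:
            var_checks.setdefault(v, []).append(j)
    for _ in range(max_phases):
        unsat = {j for j, vs in enumerate(checks)
                 if sum(word[v] for v in vs) % 2 == 1}
        flips = [v for v, js in var_checks.items()
                 if 2 * sum(j in unsat for j in js) > len(js)]
        if not flips:
            break
        for v in flips:
            word[v] ^= 1
    return word

# Toy: 3 checks over 4 bits; a single flipped bit gets corrected.
print(flip_decode([[0, 1], [1, 2], [2, 3]], [0, 0, 1, 0]))
```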

  31. Random Walk on Expanders [AKS 87] • xi converges to uniform fast (for arbitrary x0) • For a random x0: the sequence x0, x1, x2, … has interesting “random-like” properties [Figure: a walk x0 → x1 → x2 → … → xi]
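A sketch of such a walk on one explicit family: the classical 3-regular graph on Z_p with steps x → x+1, x-1, x⁻¹, a known expander family (the modulus 101, the walk length, and the sample counts are illustrative):

```python
import random
from collections import Counter

P = 101  # prime; vertices are Z_P, and 0 is treated as its own "inverse"

def step(x: int) -> int:
    move = random.choice(("+1", "-1", "inv"))
    if move == "+1":
        return (x + 1) % P
    if move == "-1":
        return (x - 1) % P
    return pow(x, P - 2, P) if x else 0  # modular inverse via Fermat

def walk(x0: int, length: int) -> int:
    for _ in range(length):
        x0 = step(x0)
    return x0

# Starting from the arbitrary vertex 0, the endpoint of a short walk is
# already close to uniform: the histogram over Z_P should look flat.
ends = Counter(walk(0, 60) for _ in range(20_000))
print(min(ends.values()), max(ends.values()))
```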

  32. Thanks
