The Promise of Differential Privacy

Presentation Transcript


  1. The Promise of Differential Privacy Cynthia Dwork, Microsoft Research

  2. NOT A History Lesson Developments presented out of historical order; key results omitted

  3. NOT Encyclopedic Whole sub-areas omitted

  4. Outline • Part 1: Basics • Smoking causes cancer • Definition • Laplace mechanism • Simple composition • Histogram example • Advanced composition • Part 2: Many Queries • Sparse Vector • Multiplicative Weights • Boosting for queries • Part 3: Techniques • Exponential mechanism and application • Subsample-and-Aggregate • Propose-Test-Release • Application of S&A and PTR combined • Future Directions

  5. Basics Model, definition, one mechanism, two examples, composition theorem

  6. Model for This Tutorial • Database is a collection of rows • One per person in the database • Adversary/User and curator computationally unbounded • All users are part of one giant adversary • “Curator against the world” [Figure: users querying the curator C, who holds the database.]

  7. Databases that Teach • Database teaches that smoking causes cancer. • Smoker S’s insurance premiums rise. • This is true even if S is not in the database! • Learning that smoking causes cancer is the whole point. • Smoker S enrolls in a smoking cessation program. • Differential privacy: limit harms to the teachings, not to participation. • The outcome of any analysis is essentially equally likely, independent of whether any individual joins, or refrains from joining, the dataset. • Automatically immune to linkage attacks.

  8. Differential Privacy [D., McSherry, Nissim, Smith 06] M gives (ε,0)-differential privacy if for all adjacent x and x’, and all C ⊆ range(M): Pr[M(x) ∈ C] ≤ e^ε · Pr[M(x’) ∈ C]. Neutralizes all linkage attacks. Composes unconditionally and automatically: the εᵢ add up (Σᵢ εᵢ); the ratio stays bounded. [Figure: Pr[response] under x and x’, with “bad responses” Z marked.]

  9. (ε, δ)-Differential Privacy M gives (ε,δ)-differential privacy if for all adjacent x and x’, and all C ⊆ range(M): Pr[M(x) ∈ C] ≤ e^ε · Pr[M(x’) ∈ C] + δ. Neutralizes all linkage attacks. Composes unconditionally and automatically: (Σᵢ εᵢ, Σᵢ δᵢ). This talk: δ negligible. [Figure: as on the previous slide.]

  10. “Privacy Loss” For an output t, define the privacy loss as ln( Pr[M(x) = t] / Pr[M(x’) = t] ). Equivalently, (ε,0)-dp says the privacy loss is bounded by ε for every t in the range. Useful Lemma [D., Rothblum, Vadhan’10]: privacy loss bounded by ε ⇒ expected privacy loss bounded by 2ε².
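As a math aside (my restatement, not part of the deck): the privacy-loss definition and the Useful Lemma read as follows; the 2ε² form is the small-ε version of the bound.

```latex
% Privacy loss of an output t, and the Useful Lemma as quoted on slide 10.
\[
  \mathcal{L}_{M,x,x'}(t) \;=\; \ln\frac{\Pr[M(x)=t]}{\Pr[M(x')=t]},
  \qquad
  (\varepsilon,0)\text{-dp} \iff |\mathcal{L}_{M,x,x'}(t)| \le \varepsilon \ \text{for all } t.
\]
\[
  \text{Useful Lemma [DRV'10]:}\quad
  |\mathcal{L}_{M,x,x'}| \le \varepsilon
  \;\Longrightarrow\;
  \mathbb{E}_{t \sim M(x)}\bigl[\mathcal{L}_{M,x,x'}(t)\bigr] \le 2\varepsilon^{2}.
\]
```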

  11. Sensitivity of a Function Adjacent databases differ in at most one row. Counting queries have sensitivity 1. Sensitivity captures how much one person’s data can affect the output: Δf = max_{adjacent x, x’} |f(x) – f(x’)|.

  12. Laplace Distribution Lap(b): p(z) = exp(−|z|/b) / 2b, variance = 2b², σ = √2·b. Increasing b flattens the curve.

  13. Calibrate Noise to Sensitivity Δf = max_{adj x, x’} |f(x) – f(x’)|. Theorem [DMNS06]: On query f, to achieve ε-differential privacy, add scaled symmetric noise Lap(b) with b = Δf/ε. Noise depends on Δf and ε, not on the database. Smaller sensitivity (Δf) means less distortion. [Figure: Laplace density with ticks at −4b … 5b.]
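A minimal sketch of the calibration on slide 13, in Python; the function name and the toy data are illustrative, not from the deck.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=np.random.default_rng()):
    """Release true_answer + Lap(b) noise with scale b = sensitivity / epsilon."""
    b = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=b)

# Counting query: one person changes the count by at most 1, so sensitivity = 1.
db = [1, 0, 1, 1, 0, 1]          # toy database: 1 = row satisfies property P
noisy_count = laplace_mechanism(sum(db), sensitivity=1.0, epsilon=0.5)
```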

  14. Example: Counting Queries • How many people in the database satisfy property P? • Sensitivity = 1 • Sufficient to add noise Lap(1/ε) • What about multiple counting queries? • It depends (see the sketch below).
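One way to read "it depends": under simple composition, k counting queries can share a budget ε by adding Lap(k/ε) noise to each count. A hedged sketch; the budget-splitting rule is illustrative, and Part 2 gives much better answers for many queries.

```python
import numpy as np

def answer_counting_queries(true_counts, epsilon_total, rng=np.random.default_rng()):
    """Simple composition: give each of the k counts an eps/k share of the budget."""
    k = len(true_counts)
    scale = k / epsilon_total            # sensitivity 1, per-query epsilon = eps/k
    return [c + rng.laplace(scale=scale) for c in true_counts]

noisy = answer_counting_queries([412, 97, 3], epsilon_total=1.0)
```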

  15. Vector-Valued Queries Δf = max_{adj x, x’} ||f(x) – f(x’)||₁. Theorem [DMNS06]: On query f, to achieve ε-differential privacy, add scaled symmetric noise [Lap(Δf/ε)]ᵈ to each coordinate. Noise depends on Δf and ε, not on the database. Smaller sensitivity (Δf) means less distortion.

  16. Example: Histograms Δf = max_{adj x, x’} ||f(x) – f(x’)||₁. Theorem: To achieve ε-differential privacy, add scaled symmetric noise [Lap(Δf/ε)]ᵈ, i.e., independent Laplace noise in each cell.
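A sketch of the histogram example in Python; the L1 sensitivity is passed in explicitly because its exact value (1 or 2) depends on which notion of adjacency is used, which the slide leaves implicit.

```python
import numpy as np

def noisy_histogram(cell_counts, epsilon, l1_sensitivity=1.0, rng=np.random.default_rng()):
    """Add independent Lap(l1_sensitivity / epsilon) noise to every histogram cell."""
    cells = np.asarray(cell_counts, dtype=float)
    return cells + rng.laplace(scale=l1_sensitivity / epsilon, size=cells.shape)

released = noisy_histogram([120, 43, 7, 0, 31], epsilon=0.1)
```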

  17. Why Does it Work? Δf = max_{x, Me} ||f(x + Me) – f(x − Me)||₁. Theorem: To achieve ε-differential privacy, add scaled symmetric noise [Lap(Δf/ε)]ᵈ. With noise scale b: Pr[M(f, x − Me) = t] / Pr[M(f, x + Me) = t] = exp( −( ||t − f(x − Me)||₁ − ||t − f(x + Me)||₁ ) / b ) ≤ exp(Δf / b) = e^ε for b = Δf/ε.

  18. “Simple” Composition • The k-fold composition of (ε, δ)-differentially private mechanisms is (kε, kδ)-differentially private.
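The bookkeeping behind slide 18, as a tiny sketch: the epsilons and deltas of the mechanisms simply add.

```python
def simple_composition(params):
    """params: list of (epsilon, delta) pairs, one per mechanism run on the data."""
    return sum(e for e, _ in params), sum(d for _, d in params)

total_eps, total_delta = simple_composition([(0.1, 1e-6)] * 10)   # (1.0, 1e-5)
```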

  19. Composition [D., Rothblum, Vadhan’10] • Qualitatively: formalize composition • Multiple, adaptively and adversarially generated databases and mechanisms • What is Bob’s lifetime exposure risk? • E.g., for a lifetime budget of (1, δ)-dp across 10,000 ε-dp or (ε, δ)-dp databases, what should the value of ε be? • Quantitatively: the k-fold composition of ε-dp mechanisms is roughly (√(2k ln(1/δ′))·ε + kε(e^ε − 1), δ′)-dp, rather than (kε)-dp.

  20. Adversary’s Goal: Guess b. Choose b ∈ {0,1}. In round i the adversary adaptively picks a pair (x_i,0, x_i,1) and a mechanism M_i, and receives M_i(x_i,b), for i = 1, …, k. b = 0 is the real world; b = 1 is the world in which Bob’s data is replaced with junk.

  21. Flavor of Privacy Proof • Recall the “Useful Lemma”: privacy loss bounded by ε ⇒ expected loss bounded by 2ε² • Model cumulative privacy loss as a martingale [Dinur, D., Nissim’03] • Bound on max per-round loss: A = ε; bound on expected per-round loss: B = 2ε² • Pr_{M1,…,Mk}[ |Σᵢ loss from Mᵢ| > z·√k·A + kB ] < exp(−z²/2).

  22. Extension to (ε, δ)-dp mechanisms • Reduce to the previous case via a “dense model theorem” [MPRV09]: an (ε, δ)-dp output Y is δ-close to some Y′ that is (ε, 0)-dp.

  23. Composition Theorem • The k-fold composition of ε-dp mechanisms is (√(2k ln(1/δ))·ε + kε(e^ε − 1), δ)-dp • What is Bob’s lifetime exposure risk? • E.g., 10,000 ε-dp or (ε, δ)-dp databases, for a lifetime cost of (1, δ)-dp • What should the value of ε be? • About 1/801 • OMG, that is small! Can we do better? • Can answer √n low-sensitivity queries with distortion o(√n) • Tight [Dinur-Nissim’03 & ff.] • Can answer n low-sensitivity queries with distortion o(n) • Tight? No. And Yes.
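A sketch of the quantitative statement, assuming the usual form of the [D., Rothblum, Vadhan ’10] bound; the numbers reproduce the slide's lifetime example only approximately.

```python
import math

def advanced_composition_epsilon(eps, k, delta_prime):
    """k-fold composition of eps-dp mechanisms is roughly (this value, delta_prime)-dp."""
    return math.sqrt(2 * k * math.log(1 / delta_prime)) * eps + k * eps * (math.exp(eps) - 1)

# Slide 23's lifetime example: 10,000 databases at eps = 1/801 each gives an
# overall epsilon on the order of 1 (for a small delta').
print(advanced_composition_epsilon(1 / 801, 10_000, 1e-9))   # ~0.8
```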

  24. Outline • Part 1: Basics • Smoking causes cancer • Definition • Laplace mechanism • Simple composition • Histogram example • Advanced composition • Part 2: Many Queries • Sparse Vector • Multiplicative Weights • Boosting for queries • Part 3: Techniques • Exponential mechanism and application • Subsample-and-Aggregate • Propose-Test-Release • Application of S&A and PTR combined • Future Directions

  25. Many Queries Sparse Vector; Private Multiplicative Weights, Boosting for Queries

  26. Caveat: Omitting polylog(various things, some of them big) terms. [Table: error and runtime of the mechanisms in this part; only the [Hardt-Rothblum] error entry and an exp(|U|) runtime entry survive in this transcript.]

  27. Sparse Vector • Database size n • # Queries: e.g., super-polynomial in n • # “Significant” queries: k • For now: counting queries only • Significant: count exceeds a publicly known threshold T • Goal: Find, and optionally release, counts for significant queries, paying only for significant queries. [Figure: a long sequence of queries, almost all insignificant.]

  28. Algorithm and Privacy Analysis [Hardt-Rothblum] Algorithm: When given query f_t: • If the noisy count f_t(x) + Lap(σ) is below the threshold T [insignificant]: • Output ⊥ • Otherwise [significant]: • Output f_t(x) + Lap(σ) Caution: the conditional branch leaks private information! Need a noisy threshold. • First attempt: it’s obvious, right? • Number of significant queries k ⇒ k invocations of the Laplace mechanism • Can choose σ so as to get error that scales with k, not with the total number of queries.

  29. Algorithm and Privacy Analysis Algorithm: When given query f_t: • If the noisy count f_t(x) + Lap(σ) is below the noisy threshold T + Lap(σ) [insignificant]: • Output ⊥ • Otherwise [significant]: • Output f_t(x) + Lap(σ) Caution: the conditional branch leaks private information! • Intuition: counts far below T leak nothing • Only charge for noisy counts that land near or above the threshold (see the sketch below).
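A minimal sketch of the sparse-vector idea of slides 27-29, assuming counting queries given as callables and a public threshold; the noise scale and the stopping rule are illustrative placeholders, not the tuned constants of the Hardt-Rothblum analysis.

```python
import numpy as np

def sparse_vector(database, queries, threshold, epsilon, max_releases,
                  rng=np.random.default_rng()):
    """Release noisy answers only for queries whose noisy count clears a noisy threshold."""
    sigma = 2 * max_releases / epsilon                 # illustrative noise scale
    noisy_threshold = threshold + rng.laplace(scale=sigma)
    answers, releases = [], 0
    for q in queries:
        noisy_count = q(database) + rng.laplace(scale=sigma)
        if noisy_count < noisy_threshold:
            answers.append(None)                       # insignificant: report "below threshold"
        else:
            answers.append(noisy_count)                # significant: pay for this release
            releases += 1
            if releases >= max_releases:
                break                                  # budget reserved for releases is spent
    return answers
```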

  30. Let • x, x’ denote adjacent databases • P denote the distribution on transcripts on input x • Q denote the distribution on transcripts on input x’ • Sample a transcript u ∼ P • Consider the privacy loss log( P(u) / Q(u) ) • Show that, except with probability δ, the loss is at most ε. Fact: (3) implies (ε, δ)-differential privacy.

  31. Write L_t for the “privacy loss in round t”. Define a borderline event E_t on the noise as “a potential query release in round t”. Analyze the privacy loss inside and outside of E_t.

  32. Borderline event, case: count near the threshold. Release condition: the noisy count exceeds the noisy threshold. Borderline event: defined so that the mass to the left of the release boundary = the mass to the right. • Properties • (1) Conditioned on E_t, round t is a release with probability ≥ 1/2 • (2) Conditioned on E_t, the privacy loss in round t is small • (3) Conditioned on ¬E_t, the privacy loss in round t is (essentially) zero.

  33. Borderline event, case (continued). Release condition: the noisy count exceeds the noisy threshold. Borderline event: the mass to the left of the release boundary = the mass to the right. • Properties • (1) Conditioned on E_t, round t is a release with probability ≥ 1/2 • (2) Conditioned on E_t, the privacy loss in round t is small • (3) Conditioned on ¬E_t, the privacy loss in round t is (essentially) zero • Think about an adjacent x’ whose count differs by 1.

  34. Borderline event, case: count far from the threshold. Release condition: the noisy count exceeds the noisy threshold. • Properties • (1) Conditioned on E_t, round t is a release with probability ≥ 1/2 • (2) Conditioned on E_t, the privacy loss in round t is small • (3) (vacuous: conditioned on ¬E_t, the privacy loss is zero).

  35. Properties • (1) Conditioned on E_t, round t is a release with probability ≥ 1/2 • (2) Conditioned on E_t, the privacy loss in round t is small • (3) Conditioned on ¬E_t, the privacy loss is (essentially) zero. By (2), (3), and the Useful Lemma, the expected loss per borderline round is small. By (1), the expected number of borderline rounds is comparable to the number of releases.

  36. Wrapping Up: Sparse Vector Analysis • Probability of (significantly) exceeding the expected number of borderline events is negligible (Chernoff) • Assuming it is not exceeded: use Azuma to argue that, with high probability, the actual total loss does not significantly exceed the expected total loss • Utility: with probability at least 1 − β, all errors are bounded by the chosen noise scale times a logarithmic factor • Choose the noise scale so that the expected total privacy loss stays within budget.

  37. Private Multiplicative Weights [Hardt-Rothblum’10] • Theorem (Main). There is an (ε, δ)-differentially private mechanism answering linear online queries over a universe U and a database of size n, with per-query running time polynomial in |U| and error on the order of √n (up to polylog factors). • Represent the database as a (normalized) histogram on U.

  38. Multiplicative Weights Recipe (delicious privacy-preserving mechanism): Maintain a public histogram x̂ (initially uniform). For each query f_t: • Receive the query • Output f_t(x̂) if it is already an accurate answer • Otherwise, output a noisy true answer • and “improve” the histogram. How to improve x̂?

  39. [Figure: estimate x̂ vs. true input histogram over bins 1 … N, before the update. Suppose the estimate’s answer to the current query is off.]

  40. [Figure: after the update, some bins’ weights are multiplied by 1.3 and others by 0.7, then renormalized.]

  41. Algorithm: • Input histogram x with n rows • Maintain a public histogram x̂, with x̂₁ uniform over U • Parameters: threshold T, noise scale σ, learning rate η • When given query f_t: • If |f_t(x̂) − (f_t(x) + Lap(σ))| < T [insignificant]: • Output f_t(x̂) • Otherwise [significant; update]: • Output f_t(x) + Lap(σ) • Multiply the weight of each element of U in the query’s support up or down by a factor exp(±η), according to the sign of the error • Renormalize.
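A rough sketch of the private multiplicative-weights loop of slides 38-41, assuming linear/counting queries given as 0-1 vectors over U; the threshold, noise scale, and learning rate are placeholders rather than the parameters of [Hardt-Rothblum’10], and no privacy-budget accounting is shown.

```python
import numpy as np

def private_mw(true_hist, queries, noise_scale, threshold, eta=0.5,
               rng=np.random.default_rng()):
    true_hist = np.asarray(true_hist, dtype=float)
    true_hist /= true_hist.sum()                           # normalized histogram on U
    est = np.full(len(true_hist), 1.0 / len(true_hist))    # public estimate, initially uniform
    answers = []
    for q in queries:                                      # q: 0-1 vector of length |U|
        q = np.asarray(q, dtype=float)
        noisy_true = q @ true_hist + rng.laplace(scale=noise_scale)
        est_answer = q @ est
        if abs(noisy_true - est_answer) < threshold:
            answers.append(est_answer)                     # insignificant: answer from the public estimate
        else:
            answers.append(noisy_true)                     # significant: release and improve the estimate
            direction = np.sign(noisy_true - est_answer)
            est *= np.exp(eta * direction * q)             # multiplicative-weights update
            est /= est.sum()                               # renormalize
    return answers
```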

  42. Analysis • Utility Analysis • Few update rounds, which allows us to choose a small per-round noise scale • Potential argument [Littlestone-Warmuth’94] • Uses linearity • Privacy Analysis • Same as in Sparse Vector!

  43. Caveat (repeated): Omitting polylog(various things, some of them big) terms. [Table: error and runtime of the mechanisms in this part; only the [Hardt-Rothblum] error entry and an exp(|U|) runtime entry survive in this transcript.]

  44. Boosting [Schapire, 1989] • General method for improving the accuracy of any given learning algorithm • Example: learning to recognize spam e-mail • “Base learner” receives labeled examples, outputs a heuristic • Run many times; combine the resulting heuristics.

  45. [Boosting loop: S = labeled examples drawn from distribution D → Base Learner → hypothesis A that does well on ½ + η of D; collect A1, A2, …; combine A1, A2, …; update D; terminate?]

  46. [Same loop; note the base learner only sees samples, not all of D. Update D: how?]

  47. Boosting for Queries? • Goal: Given database x and a set Q of low-sensitivity queries, produce an object O such that for all q ∈ Q one can extract from O an approximation of q(x). • Assume the existence of an (ε₀, δ₀)-dp base learner producing an object O that does well on more than half of D: • Pr_{q ∼ D}[ |q(O) − q(x)| < λ ] > 1/2 + η.

  48. Initially: D uniform on Q. [Boosting loop: S = labeled examples from D → Base Learner → object A that does well on ½ + η of D; collect A1, A2, …; combine A1, A2, …; update D.]

  49. [Figure: weights of D over queries 1 … |Q|, shown against the truth, before the update.]

  50. [Figure: weights of D over queries 1 … |Q| after the update; some multiplied by 1.3, others by 0.7.] Increased where the disparity is large, decreased elsewhere.
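A rough sketch of the boosting-for-queries loop of slides 47-50, hedged: base_learner and eval_query are assumed stand-ins for the base synopsis generator and for evaluating a query on a synopsis, and the sample size and reweighting factors are illustrative, not the parameters of the actual construction.

```python
import numpy as np

def boost_for_queries(x, queries, base_learner, eval_query, rounds, lam, eta=0.3,
                      rng=np.random.default_rng()):
    """Maintain a distribution D on Q; reweight toward queries the combined object still gets wrong."""
    D = np.full(len(queries), 1.0 / len(queries))             # initially uniform on Q
    objects = []
    for _ in range(rounds):
        sampled = rng.choice(len(queries), size=50, p=D)       # base learner sees samples from D
        objects.append(base_learner(x, [queries[i] for i in sampled]))
        for i, q in enumerate(queries):
            combined = np.median([eval_query(o, q) for o in objects])   # "combine" by median
            disparity = abs(combined - q(x))
            D[i] *= np.exp(eta) if disparity > lam else np.exp(-eta)    # up where disparity is large
        D /= D.sum()                                           # renormalize the distribution on Q
    return objects
```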
