
Differential Privacy • Xintao Wu • Oct 31, 2012






Presentation Transcript


  1. Differential Privacy • Xintao Wu • Oct 31, 2012

  2. Sanitization approaches • Input perturbation • Add noise to data • Generalize data • Summary statistics • Means, variances • Marginal totals • Model parameters • Output perturbation • Add noise to summary statistics

  3. Blending/hiding into a crowd • K-anonymity based approaches • An adversary may exploit various kinds of background knowledge to breach privacy • Such privacy models often assume “the adversary’s background knowledge is given”

  4. Classic intuition for privacy • Privacy means that anything that can be learned about a respondent from the statistical database can be learned without access to the database • Security of encryption: anything about the plaintext that can be learned from a ciphertext can be learned without the ciphertext • Prior and posterior views about an individual should not change much

  5. Motivation • Publicly release statistical information about a dataset without compromising the privacy of any individual

  6. Requirement • Anything that can be learned about a respondent from a statistical database should be learnable without access to the database • Reduce the knowledge gained from joining the database • Require that the probability distribution over the public results is essentially the same whether any individual opts in to, or opts out of, the dataset

  7. Definition
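The definition appeared as an image on the slide and is not in this transcript; the standard statement it corresponds to is the following. A randomized mechanism $K$ gives $\epsilon$-differential privacy if for all datasets $D_1$ and $D_2$ differing in at most one record, and for all $S \subseteq \mathrm{Range}(K)$,

$$\Pr[K(D_1) \in S] \le e^{\epsilon} \cdot \Pr[K(D_2) \in S].$$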

  8. Sensitivity function • Captures how large a difference in the query answer the additive noise must hide when a single record changes
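For reference (the formula on the slide is not in the transcript): the global sensitivity of $f : \mathcal{D} \to \mathbb{R}^k$ is the largest change in its output over any pair of datasets differing in one record,

$$\Delta f = \max_{D_1, D_2 \,:\, d(D_1, D_2) = 1} \lVert f(D_1) - f(D_2) \rVert_1 .$$

For a counting query, $\Delta f = 1$.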

  9. Laplace distribution noise
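The Laplace distribution with scale $b$, written $\mathrm{Lap}(b)$, has density and variance

$$p(x) = \frac{1}{2b} \exp\!\left(-\frac{|x|}{b}\right), \qquad \mathrm{Var} = 2b^2 .$$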

  10. Gaussian noise
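As background (not spelled out on the slide): Gaussian noise does not give pure $\epsilon$-differential privacy; it is the standard mechanism for the relaxed $(\epsilon, \delta)$ variant. One common calibration uses the $L_2$ sensitivity $\Delta_2 f$ and sets

$$\sigma \ge \frac{\Delta_2 f \sqrt{2 \ln(1.25/\delta)}}{\epsilon} .$$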

  11. Adding Laplace noise
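The mechanism releases $f(D) + \mathrm{Lap}(\Delta f / \epsilon)$, with independent noise per output coordinate. A minimal sketch in Python; the function name and interface are illustrative, not from the slides:

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return the true answer plus Laplace noise of scale sensitivity/epsilon."""
    rng = rng or np.random.default_rng()
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query (sensitivity 1) answered at epsilon = 0.1.
noisy_count = laplace_mechanism(42, sensitivity=1.0, epsilon=0.1)
```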

  12. Proof sketch
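The proof on the slide is an image; the standard one-dimensional argument it sketches is this. With noise scale $b = \Delta f / \epsilon$, for any output $t$ and neighboring datasets $D_1, D_2$,

$$\frac{p_{D_1}(t)}{p_{D_2}(t)} = \exp\!\left(\frac{\epsilon \left( |f(D_2) - t| - |f(D_1) - t| \right)}{\Delta f}\right) \le \exp\!\left(\frac{\epsilon \, |f(D_1) - f(D_2)|}{\Delta f}\right) \le e^{\epsilon},$$

where the first inequality is the triangle inequality and the second uses the definition of $\Delta f$.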

  13. Delta_f=1, epsilon varies

  14. Delta_f=1 epsilon=0.01

  15. Delta_f=1 epsilon=0.1

  16. Delta_f=1 epsilon=1

  17. Delta_f=1 epsilon=2

  18. Delta_f=1 epsilon=10

  19. Delta_f=2, epsilon varies

  20. Delta_f=3, epsilon varies

  21. Delta_f=10000, epsilon varies
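The figures on slides 13-21 are not in the transcript; they plot the $\mathrm{Lap}(\Delta f / \epsilon)$ noise density, which flattens (more noise) as $\epsilon$ shrinks or $\Delta f$ grows. A minimal sketch to reproduce plots of this kind:

```python
import numpy as np
import matplotlib.pyplot as plt

delta_f = 1.0
x = np.linspace(-50, 50, 1001)
for eps in [0.01, 0.1, 1, 2, 10]:
    b = delta_f / eps  # noise scale calibrated to sensitivity / epsilon
    plt.plot(x, np.exp(-np.abs(x) / b) / (2 * b), label=f"epsilon = {eps}")
plt.legend()
plt.title("Lap(delta_f / epsilon) densities, delta_f = 1")
plt.show()
```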

  22. Composition • Sequential composition: for a sequence of analyses over the same data, the privacy costs (the epsilons) add up • Parallel composition: for analyses over disjoint sets, the ultimate privacy guarantee depends only on the worst of the guarantees of each analysis, not the sum
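Stated precisely (standard results, paraphrased rather than quoted from the slide): if mechanisms $M_1, \dots, M_n$ are $\epsilon_1, \dots, \epsilon_n$-differentially private, then running all of them on the same data is $\left(\sum_i \epsilon_i\right)$-differentially private, while running each on a disjoint subset of the data is $\left(\max_i \epsilon_i\right)$-differentially private.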

  23. Example • Assume a table of 1000 customers, where each record has the attributes name, gender, city, cancer, and salary • For attribute city, the domain size is 10 • For attribute cancer, we record only Yes or No for each customer • For attribute salary, the domain range is 0-10k • The privacy threshold \epsilon is a constant 0.1 set by the data owner • Consider the single query “How many customers got cancer?” • The adversary is allowed to ask this query three times (see the worked note below)
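A worked note, assuming the count query has sensitivity 1 (adding or removing one customer changes the count by at most 1): a single answer is released with $\mathrm{Lap}(1/0.1) = \mathrm{Lap}(10)$ noise. If the adversary asks the same query three times, sequential composition applies: either the total privacy cost grows to $3 \times 0.1 = 0.3$, or the owner must split the budget so that each answer gets $\epsilon = 0.1/3$ and the much larger $\mathrm{Lap}(30)$ noise.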

  24. Example (continued) • “How many customers got cancer in each city?” • The single query “What is the sum of salaries across all customers?” (see the note below)
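Continuing the worked note: the per-city counts cover disjoint sets of customers, so parallel composition applies and the whole 10-bin histogram costs only $\epsilon = 0.1$, with $\mathrm{Lap}(10)$ noise per bin. The salary sum, by contrast, has sensitivity 10k (one customer can shift the total by the entire domain range), so it needs $\mathrm{Lap}(10000/0.1) = \mathrm{Lap}(100000)$ noise, showing how large sensitivity hurts utility at a fixed budget.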

  25. Types of computation (queries) • Some are very sensitive, others are not • A single query vs. a query sequence • Queries over disjoint sets or not • Expected outcome: a number vs. arbitrary output • Interactive vs. non-interactive

  26. Sensitivity • Global sensitivity • Local sensitivity • Smooth sensitivity
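For reference (the definitions are not spelled out on the slide): global sensitivity maximizes over all neighboring pairs, while local sensitivity fixes the actual dataset $D$,

$$GS_f = \max_{D, D' \,:\, d(D, D') = 1} \lVert f(D) - f(D') \rVert_1, \qquad LS_f(D) = \max_{D' \,:\, d(D, D') = 1} \lVert f(D) - f(D') \rVert_1 .$$

Local sensitivity can itself leak information about $D$; smooth sensitivity is a smoothed upper bound on $LS_f$ that can be used to calibrate noise safely.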

  27. Different areas of DP • PINQ • Data mining with DP • Optimizing linear counting queries under differential privacy: the matrix mechanism for answering a workload of predicate counting queries

  28. PPDM interface: PINQ • A programmable privacy-preserving layer • Adds calibrated noise to each query • Requires assigning a privacy cost budget (see the illustrative sketch below)
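A minimal sketch of the budget-tracking idea behind such a layer. This is illustrative Python, not PINQ's actual C# API; the class and method names are hypothetical:

```python
import numpy as np

class PrivateDataset:
    """Hypothetical privacy-preserving layer that tracks a privacy budget."""

    def __init__(self, rows, total_budget):
        self._rows = rows
        self._remaining = total_budget  # remaining epsilon budget
        self._rng = np.random.default_rng()

    def noisy_count(self, predicate, epsilon):
        # Refuse any query that would overspend the budget.
        if epsilon > self._remaining:
            raise ValueError("privacy budget exhausted")
        self._remaining -= epsilon
        true_count = sum(1 for row in self._rows if predicate(row))
        # A counting query has sensitivity 1, so Lap(1/epsilon) noise suffices.
        return true_count + self._rng.laplace(scale=1.0 / epsilon)

# Example: spend half of a 0.1 budget on one count.
ds = PrivateDataset(rows=[{"cancer": "Yes"}, {"cancer": "No"}], total_budget=0.1)
print(ds.noisy_count(lambda r: r["cancer"] == "Yes", epsilon=0.05))
```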

  29. Data mining with DP • Previous approach: a privacy-preserving interface ensures all DP requirements • Problem: inferior results if the interface is used naively during data mining • Solution: consider privacy and mining together • DP ID3: uses noisy counts, and evaluates all attributes in one exponential mechanism query using the entire budget, instead of splitting the budget among multiple queries (see the sketch below)
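A sketch of the single exponential-mechanism query mentioned above, in Python. The quality function (for DP ID3, something like information gain computed on the private data) and its sensitivity are placeholders; this illustrates the mechanism, not the authors' exact algorithm:

```python
import numpy as np

def choose_attribute(attributes, quality, epsilon, sensitivity, rng=None):
    """Exponential mechanism: sample one attribute with probability
    proportional to exp(epsilon * quality / (2 * sensitivity))."""
    rng = rng or np.random.default_rng()
    scores = np.array([quality(a) for a in attributes], dtype=float)
    logits = epsilon * scores / (2.0 * sensitivity)
    probs = np.exp(logits - logits.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return attributes[rng.choice(len(attributes), p=probs)]
```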

  30. DP in social networks • See pages 97-120 of the PAKDD’11 tutorial
