
Privacy-Preserving Distributed Information Sharing


Presentation Transcript


  1. Privacy-Preserving Distributed Information Sharing Nan Zhang and Wei Zhao Texas A&M University, USA

  2. Outline • Motivation • Dealing with malicious adversaries • Existing and new protocols • Conclusion

  3. Information Sharing between Autonomous Entities — Problem Definition (diagram: knowledge exchanged between autonomous entities)

  4. Example • Supplier's product list (SECRET): Secret Weapon I, Secret Weapon II, Secret Weapon III, Secret Weapon IV, Secret Weapon V, Dream Machine, Cancer Medicine, Perpetual Machine, … • Consumer's shopping list (SECRET): Secret Weapon I, Secret Weapon V, … • The two parties want to find the items in common (the contract) without revealing the rest of their lists

  5. Privacy Concern • Privacy laws: countries with enacted or pending omnibus privacy laws • HIPAA (Health Insurance Portability and Accountability Act) [www.privacy.org, 2002]

  6. Privacy-Preserving Information Sharing • Sharing information across private databases without violating each party’s privacy.

  7. Objectives • To ensure accuracy of information sharing results • To guarantee privacy of each party How do we measure accuracy and privacy?

  8. Measurement of Accuracy • Traditional measure of accuracy: 1 if all parties obtain correct information sharing results, 0 otherwise • We measure accuracy la by the expected value of the traditional measure, i.e., the probability that all parties obtain correct information sharing results • la ranges continuously over [0, 1], from "fails" to "accomplishes"

  9. Measurement of Privacy Disclosure • Traditional measure in cryptography: 0 if no privacy disclosure, 1 otherwise • Our measure lp in information sharing: percentage of private information compromised • lp ranges continuously over [0, 1], from "undisclosed" to "disclosed"
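The two continuous measures can be sketched in a few lines of Python (function and variable names are ours, purely illustrative): la estimated as the fraction of protocol runs in which all parties got correct results, lp as the fraction of one party's private items that the adversary compromised.

```python
# Sketch of the continuous accuracy and privacy measures described
# above; names and the estimation-by-runs approach are assumptions.

def accuracy_measure(run_outcomes):
    """la: probability that all parties obtain correct results,
    estimated as the fraction of successful protocol runs (1/0)."""
    return sum(run_outcomes) / len(run_outcomes)

def privacy_disclosure(private_items, compromised_items):
    """lp: percentage of private information compromised."""
    leaked = set(private_items) & set(compromised_items)
    return len(leaked) / len(private_items)

runs = [1, 1, 0, 1]                  # 1 = all parties got correct results
print(accuracy_measure(runs))        # 0.75
secrets = {"w1", "w2", "w3", "w4"}
print(privacy_disclosure(secrets, {"w2"}))  # 0.25
```

Both values fall in [0, 1], matching the continuous scales on the slides, rather than the all-or-nothing 0/1 of the traditional measures.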

  10. Baseline Architecture • With trusted third party (TTP) • Without trusted third party

  11. System Architecture (diagram: each party has a local processing module and a database, connected via the Internet)

  12. External Attacks (diagram: attacks from outside the parties, over the Internet) • These attacks can be defended against with traditional system security measures

  13. Internal Attacks — Internal party as adversary (diagram: one of the participating parties attacks over the Internet)

  14. Semi-honest Adversaries • Properly follow the protocol • Record intermediate computation and communication to infer the private information of the other party • Passive attack

  15. Protocols Against Semi-honest Adversaries • Almost all existing protocols • Can be efficient • Unrealistic assumption: semi-honest

  16. Malicious Adversaries • Can do whatever they want to obtain the private information of the other party • May revise the local processing module and/or alter inputs • Active attack

  17. Protocols Against Malicious Adversaries • A few protocols exist, each with its own restrictions • Inefficient

  18. A Dilemma UNREALISTIC TOO DIFFICULT Semi-honest Malicious

  19. Our Goal: Defend Against Malicious Adversaries Effectively and Efficiently But how?

  20. Our Approach I — Generalization of privacy & accuracy measures • Continuous accuracy measure la (recall: from "fails" to "accomplishes") • Continuous privacy measure lp (recall: from "undisclosed" to "disclosed")

  21. Our Approach II — Classification of malicious adversaries • By priority • By behavior

  22. Outline • Motivation • Dealing with malicious adversaries • Existing and new protocols • Conclusion

  23. Classification of Adversaries • Priority of Adversary • To obtain the privacy of other parties • To accomplish information sharing

  24. Adversaries that Care About Information Sharing • Example (diagram of the supplier's product list and the consumer's shopping list): if the supplier learns that the consumer needs Secret Weapon IV, that is a PRIVACY BREACH of the consumer's secret shopping list

  25. Adversaries that Care About Information Sharing • Example, continued (diagram of the supplier's product list and the consumer's shopping list) • An adversary may be penalized if some parties cannot obtain the accurate information sharing results

  26. Priority of Adversary (spectrum: from information sharing as the first priority to privacy intrusion as the first priority)

  27. Measure of Adversary's Priority • Priority is measured by λ, such that the goal of the adversary is to maximize u = (1 − λ)·la + λ·lp • la ∈ {0, 1}: probability that all parties obtain correct information sharing results • lp ∈ [0, 1]: percentage of other parties' private information that is compromised by the adversary

  28. Classification of Malicious Adversaries by Their Priority • u = (1 − λ)·la + λ·lp • λ = 0: honest • 0 < λ < 1/2: weakly malicious (information sharing as the first priority) • 1/2 ≤ λ ≤ 1: strongly malicious (privacy intrusion as the first priority)
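The utility function and the λ-based classification above can be sketched directly (a minimal illustration; the function names are ours):

```python
# Sketch of the adversary model: the adversary maximizes
# u = (1 - λ)·la + λ·lp, and λ determines its class.

def utility(lam, l_a, l_p):
    """Adversary's objective u = (1 - λ)·la + λ·lp."""
    return (1 - lam) * l_a + lam * l_p

def classify(lam):
    """Map the priority parameter λ to the adversary class."""
    if lam == 0:
        return "honest"
    if lam < 0.5:
        return "weakly malicious"
    return "strongly malicious"

# A weakly malicious adversary (λ < 1/2) prefers accurate sharing
# over stolen privacy: an intrusion that kills the sharing result
# strictly lowers its utility.
print(classify(0.3))                             # weakly malicious
print(utility(0.3, 1, 0) > utility(0.3, 0, 1))   # True
```

This is exactly the lever the later slides exploit: if a successful intrusion forces a failed information sharing, a weakly malicious adversary will not attempt the intrusion.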

  29. Adversary Space (axes: priority vs. behavior, spanning semi-honest, weakly malicious, and strongly malicious adversaries)

  30. Outline • Problem definition • Dealing with malicious adversaries • Existing and new protocols • Conclusion

  31. Protocol DE — Double Encryption • Existing protocol [R. Agrawal et al., 2003] • For the intersection of two datasets • Basic idea (diagram): each element a is encrypted by both parties A and B to compute A∩B

  32. Protocol DE • Input: datasets A (|A| = 8) and B (|B| = 10). Output: A∩B. (diagram: Alice and Bob each encrypt both sets, preserving the same order, and match the doubly encrypted elements to obtain A∩B)
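The double-encryption idea can be sketched with a commutative cipher. Below is a toy illustration using modular exponentiation E_k(x) = x^k mod p (a Pohlig–Hellman-style commutative encryption); the prime, keys, and datasets are made-up toy values, not parameters from the paper:

```python
# Toy sketch of Protocol DE: because E_a(E_b(x)) = E_b(E_a(x)),
# matches between the doubly encrypted sets reveal A ∩ B and
# nothing else about the non-matching elements.
P = 2_147_483_647  # a prime shared by both parties (toy size)

def enc(x, key):
    """Commutative encryption E_key(x) = x^key mod P."""
    return pow(x, key, P)

alice_key, bob_key = 7, 11   # each party's secret exponent (toy values)
A = {2, 3, 5}                # Alice's dataset, mapped to group elements
B = {3, 5, 8}                # Bob's dataset

# Each party encrypts its own set, exchanges it, and the other
# party encrypts it a second time.
double_A = {enc(enc(x, alice_key), bob_key) for x in A}
double_B = {enc(enc(x, bob_key), alice_key) for x in B}

# Matching doubly encrypted values correspond to A ∩ B = {3, 5}.
print(len(double_A & double_B))   # 2
```

A real deployment would first hash elements into the group and use cryptographically sized parameters; the sketch only shows why commutativity makes the matching step work.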

  33. Protocol TPS — Trust the Party with the Smallest Dataset • Our new protocol I • Basic idea (diagram): trust goes to the party with the smaller dataset (size 8 vs. size 10)

  34. Assumptions • The distribution of the number of data points of each party is known by all parties • For the sake of simplicity, we assume that both parties have the same distribution

  35. Protocol TPS • Input: datasets A (|A| = 8) and B (|B| = 10). Output: A∩B. (diagram: the parties exchange dataset sizes; the party with the smaller dataset is trusted in the computation of A∩B)
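The TPS decision step can be sketched as a one-liner (a minimal sketch of the idea only; the function name and tie-breaking rule are our assumptions, not the paper's specification):

```python
# Hypothetical sketch of the TPS trust decision: the party with the
# smaller announced dataset is trusted, since inflating one's dataset
# (to capture more of the peer's data) is what forfeits trust.
def trusted_party(size_alice, size_bob):
    return "Alice" if size_alice <= size_bob else "Bob"

print(trusted_party(8, 10))    # Alice
print(trusted_party(12, 10))   # Bob
```

The slide's assumption that the size distribution is known to all parties is what makes an abnormally large announced size detectable in the first place.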

  36. Protocol RPL — Reject Parties with Too Large Datasets • Our new protocol II • Basic idea: reject parties whose datasets are larger than a threshold set by the honest parties

  37. Protocol RPL • Input: datasets A (|A| = 8) and B (|B| = 10). Output: A∩B. (diagram: the parties exchange dataset sizes and each asks "Is 10 too large?" / "Is 8 too large?" before computing A∩B)
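The RPL admission check can be sketched as follows. The threshold rule below (mean plus k standard deviations of the known size distribution) is an assumed illustration, not the threshold actually derived in the paper:

```python
# Hypothetical sketch of the RPL admission check: honest parties
# derive a size threshold from the known distribution of dataset
# sizes and reject any peer that announces a larger dataset.

def size_threshold(mean_size, std_dev, k=3.0):
    """Assumed rule: reject sizes more than k standard deviations
    above the mean (k is a tuning parameter we made up)."""
    return mean_size + k * std_dev

def rpl_accept(announced_size, threshold):
    """Admit a party only if its dataset is not too large."""
    return announced_size <= threshold

t = size_threshold(mean_size=10, std_dev=2)    # 16.0
print(rpl_accept(10, t), rpl_accept(40, t))    # True False
```

This is the dilemma from the later slides in code form: an adversary that pads its dataset to capture more of the peer's data gets rejected, so the information sharing it values fails.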

  38. Performance: Efficiency (chart: communication overhead of protocols DE, TPS, and RPL on a scale of 2|V0| to 4|V0|, compared against the lower bounds required to be secure against semi-honest and against weakly malicious adversaries)

  39. Performance: Defense Against Weakly Malicious Adversaries (charts: privacy disclosure lp(sA, sD0) and accuracy la(sA, sD0), both in %, vs. dataset size |V| from 10^2 to 10^3.5, for protocols DE, TPS, and RPL)

  40. Performance Evaluation: Defense Against Strongly Malicious Adversaries (charts: privacy disclosure lp(sA, sD0) and accuracy la(sA, sD0), both in %, vs. |V| from 10^2 to 10^3.5, for protocol DE, protocol TPS, and protocol RPL with the system parameter — the penalty/benefit of a privacy intrusion attack — set to 10, 2, and ≤ 1)

  41. Outline • Problem definition • Dealing with malicious adversaries • Existing and new protocols • Conclusion

  42. Final Remarks • Simple and efficient protocols exist if we • Adopt the continuous measure of privacy disclosure • Constrain the adversary's goal to be weakly malicious • Future work • Additional set operation protocols • Multiple correlated attacks

  43. Q&A Thank you

  44. Backup Slides

  45. Weakly and Strongly Malicious • u = (1 − λ)·la + λ·lp • λ = 0: honest • 0 < λ < 1/2: weakly malicious (information sharing as first priority; if successful intrusion ⇒ failed information sharing, then the adversary will not perform the intrusion) • 1/2 ≤ λ ≤ 1: strongly malicious (privacy intrusion as first priority)

  46. Adversary Classification

  47. Defense Against Weakly Malicious Adversaries — Methodology • Goal of adversary: maximize u = (1 − λ)·la + λ·lp • Weakly malicious means λ < 1/2 • The optimal strategy for weakly malicious adversaries (sA) is to alter its dataset to V1′ s.t. V1 ⊆ V1′ • Recall: if successful intrusion ⇒ failed information sharing, then the adversary will not perform the intrusion

  48. Basic Idea of Defense Against Weakly Malicious Adversaries • Give them a dilemma: no intrusion ⇒ successful information sharing; intrusion ⇒ failed information sharing • Recall: if successful intrusion ⇒ failed information sharing, then the weakly malicious adversary will not perform the intrusion

  49. Defense Against Strongly Malicious Adversaries • We have to sacrifice some honest parties, because we cannot distinguish them from strongly malicious adversaries (diagram: honest parties Alice, Eve, Justin, … indistinguishable from strongly malicious ones)

  50. Privacy Disclosure w/ Weakly Malicious Adversaries • When an honest party takes the strategy (sD0) of strictly following the protocol, we have lp(sA, sD0) ≤ Pr{v ∈ V0 | v ∈ V} / |V|
