
Evaluating LBS Privacy In DYNAMIC CONTEXT


Presentation Transcript


  1. Evaluating LBS Privacy In DYNAMIC CONTEXT

  2. Outline • Introduction • Overview of Attack Models • Classification of Defense Models • Evaluation Module • Conclusion


  4. What do we need now? • The problem of privacy preservation in a context-aware environment • Different services require different algorithms • Requirements differ even within a single service → How to solve it? • Provide suitable privacy-preserving services (algorithms) after forecasting the user's privacy concern • Evaluate the results of those privacy-preserving services (algorithms), then refine them

  5. Key problem • Provide a privacy protection level that suits the context • rather than increasing the protection level unconditionally • Each service provider obtains the user's current privacy concern • and optimizes the privacy-preserving level → this is the service provider's problem

  6. Our Assumptions • The LBS middleware is a trusted third party responsible for blurring the exact location information • 1: the user sends Query + Location Information to the middleware • 2: the middleware forwards Query + Cloaked Spatial Region to the privacy-aware query processor at the location-based database server • 3: the server returns a Candidate Answer to the middleware • 4: the middleware returns the Candidate Answer to the user
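
As a rough illustration of this flow, here is a minimal Python sketch of the trusted-middleware pipeline. All names (Query, CloakedQuery, cloak_region, query_processor, handle) and the fixed-size region are illustrative assumptions, not part of the system described in the slides.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Query:
        user_id: str
        poi_type: str                      # e.g. "hospital"
        location: Tuple[float, float]      # exact (x, y); known only to the middleware

    @dataclass
    class CloakedQuery:
        poi_type: str
        region: Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max)

    def cloak_region(q: Query, k: int) -> CloakedQuery:
        """Step 2: the trusted middleware blurs the exact location into a spatial
        region; a real cloaker would grow the region until it covers k users."""
        x, y = q.location
        half = 0.5                         # toy fixed-size region for illustration
        return CloakedQuery(q.poi_type, (x - half, y - half, x + half, y + half))

    def query_processor(cq: CloakedQuery) -> List[str]:
        """Step 3: the privacy-aware query processor answers for the whole region,
        returning a candidate answer set (placeholder result)."""
        return ["poi_1", "poi_2", "poi_3"]

    def handle(q: Query, k: int) -> List[str]:
        """Steps 1-4: the exact location never leaves the middleware."""
        candidates = query_processor(cloak_region(q, k))
        return candidates                  # step 4: candidates returned to the user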

  7. Key problem (1) • Provide a privacy protection level that suits the context • rather than increasing the protection level unconditionally • Each service provider obtains the user's current privacy concern • and optimizes the privacy-preserving level → push this puzzle to the middleware → we need an efficient context management and privacy evaluation system

  8. Related Work • An index-based privacy-preserving service trigger, by Y. Lee and O. Kwon [13]

  9. Output model of the privacy concern index (index levels 1–5)

  10. Privacy Concern Index value is 3.9

  11. Related Work • An index-based privacy-preserving service trigger by Y. Lee and O. Kwon [12] • Advantages • Easy to implement, good performance • Disadvantages • Static context • Results come mostly from users' feelings (gathered through surveys) • How can a privacy-preserving service use the result? • Limited support for privacy algorithms → we need an efficient context management method

  12. Related Work (1) • Efficient profile aggregation and policy evaluation in a middleware for adaptive mobile applications (the CARE middleware), Claudio Bettini [13]

  13. CARE Middleware

  14. Related Work (1) • Efficient profile aggregation and policy evaluation in a middleware for adaptive mobile applications (the CARE middleware), Claudio Bettini [14] • Advantages • Manages context efficiently and dynamically • Results can be used directly by privacy algorithms • Scalability

  15. System Design Model (1)

  16. Evaluation Module: Version 1

  17. Version 2

  18. Outline • Introduction • Overview of Attack Models • Classification of Defense Models • Evaluation Module • Conclusion

  19. Overview of Attack Models [1] • What is a privacy threat? • Whenever an adversary can associate • the identity of a user with • information that the user considers private

  20. Overview of Attack Models (1) • What is a privacy attack? • A specific method used by an adversary to obtain the sensitive association • How are privacy attacks classified? • By the parameters of an adversary model • What are the main components of an adversary model? • The target private information • The ability to obtain the transmitted messages • The background knowledge and inferencing abilities

  21. How can the adversary model be used? • The target private information • Explicit in the message (i.e., the real ID) • obtained by eavesdropping on the channel • Implicit (using a pseudo-ID) • obtained by inference with external knowledge • e.g., joining the pseudo-ID with location data → attacks exploiting quasi-identifiers in requests

  22. How can the adversary model be used? • The ability to obtain the transmitted messages • Messages: snapshot or chain • Issuers: single or multiple → single- versus multiple-issuer attacks

  23. How can the adversary model be used? • The background knowledge and inferencing abilities • Unavailable • depends on the sensitive information in the message (implicit or explicit) • Completely available • a privacy violation can occur independently of the service request → attacks exploiting knowledge of the defense

  24. Outline • Introduction • Overview of Attack Models • Classification of Defense Models • Evaluation Module • Conclusion

  25. Classification of Defense Models • Our target • Architecture: centralized • Techniques: anonymity-based and obfuscation-based • Defense models against • Snapshot, single-issuer, def-unaware attacks • Snapshot, single-issuer, def-aware attacks • Historical, single-issuer attacks • Multiple-issuer attacks

  26. Outline • Introduction • Overview of Attack Models • Classification of Defense Models • Snapshot, single-issuer, def-unaware attacks • Snapshot, single-issuer, def-aware attacks • Historical, single-issuer attacks • Multiple-issuer attacks • Evaluation Module • Conclusion

  27. Single-issuer, def-unaware attacks • Assumptions • The attacker can acquire knowledge of the exact location of each user • The attacker knows that the generalized region g(r).Sdata always includes the point r.Sdata • The attacker cannot reason over more than one request → uniform attack

  28. Single-issuer, def-unaware attacks • A uniform attack: the cloaked region covers only three users (u1, u2, u3), so it is not safe if the issuer requires k = 4 (with threshold h = 1/4)
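
A small Python sketch of the uniform-attack check for this example; the helper name is_safe_under_uniform_attack is ours, and the rule encoded is simply that the adversary's uniform probability 1/|AS| must not exceed the threshold h.

    def is_safe_under_uniform_attack(users_in_region: int, h: float) -> bool:
        """Under a uniform attack the adversary assigns every user in the cloaked
        region the same probability of being the issuer; the request is safe only
        if that probability does not exceed the threshold h."""
        return 1.0 / users_in_region <= h

    # The example above: only u1, u2 and u3 fall inside the region, h = 1/4.
    print(is_safe_under_uniform_attack(3, 0.25))   # False: 1/3 > 1/4, not safe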

  29. Single-issuer, def-aware attacks • Assumptions • As in the def-unaware case • In addition, the attacker knows the generalization function g → uniform attack and the outlier problem

  30. Example attack: the outlier problem

  31. Cst+g-unsafe generalization algorithms • The following algorithms are not safe: • IntervalCloaking • Nearest Neighbor Cloak • … • Why are they not safe?

  32. Cst+g-unsafe generalization algorithms • Why these algorithms are not safe: • not every user in the anonymizing set (AS) generates the same AS for a given k → uniform attack • A property that Cst+g-safe generalization algorithms must satisfy: • the AS contains the issuer U and at least k-1 additional users • every user in the AS generates the same AS for a given k → the reciprocity property

  33. Cst+g-safe generalization algorithms • hilbASR • dichotomicPoints • Grid → All of the above algorithms satisfy the reciprocity property, so they remain safe even when the attacker knows the generalization function.
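
To illustrate why bucket-style algorithms such as hilbASR satisfy reciprocity, here is a minimal Python sketch: users are sorted along a space-filling curve and cut into fixed buckets of size k, so every member of a bucket obtains exactly the same AS. The 1-D curve_rank ordering stands in for a real Hilbert curve, and the function names are ours.

    from typing import Dict, List, Tuple

    def curve_rank(pos: Tuple[float, float]) -> float:
        """Placeholder for a real Hilbert-curve index; any fixed total order over
        positions preserves the reciprocity argument below."""
        x, y = pos
        return x + 1000.0 * y

    def bucket_anonymity_set(users: Dict[str, Tuple[float, float]],
                             issuer: str, k: int) -> List[str]:
        """Sort all users along the curve and cut the sequence into fixed buckets
        of size k (the last bucket absorbs the remainder).  Bucket boundaries do
        not depend on who issued the request, so every member of a bucket obtains
        exactly the same AS -> reciprocity."""
        order = sorted(users, key=lambda u: curve_rank(users[u]))
        buckets = max(1, len(order) // k)              # number of buckets
        b = min(order.index(issuer) // k, buckets - 1)
        start = b * k
        end = (b + 1) * k if b < buckets - 1 else len(order)
        return order[start:end]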

  34. Grid algorithm

  35. Centralized defenses against snapshot, single-issuer, def-aware attacks • To defend against snapshot, single-issuer, def-aware attacks, the generalization must satisfy the reciprocity property • How do we decide whether an algorithm satisfies that property?

  36. Deciding whether an algorithm satisfies reciprocity • For a request r with anonymity level k • Run the algorithm to obtain the AS • For each user ui in the AS, run the algorithm as if ui were the issuer to obtain ASi • If AS = ASi for every i, the algorithm satisfies reciprocity for this request • Otherwise, it is not safe
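
A minimal Python sketch of this direct check; the generalization algorithm is passed in as a callable, and the names (CloakFn, satisfies_reciprocity) are ours.

    from typing import Callable, Dict, FrozenSet, Tuple

    # A generalization algorithm: (all user positions, issuer id, k) -> anonymity set
    CloakFn = Callable[[Dict[str, Tuple[float, float]], str, int], FrozenSet[str]]

    def satisfies_reciprocity(cloak: CloakFn,
                              users: Dict[str, Tuple[float, float]],
                              issuer: str, k: int) -> bool:
        """Run the algorithm for the real issuer, then re-run it pretending that
        each member of the resulting AS is the issuer; reciprocity holds for this
        request only if every run yields exactly the same AS."""
        as_of_issuer = cloak(users, issuer, k)
        return all(cloak(users, u, k) == as_of_issuer for u in as_of_issuer)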

  37. Checking reciprocity from previously computed results • After checking reciprocity directly, save the result to a database • For a new request r, look for a similar case: • the result of a previous request from the same issuer (if the issuer has not moved far), or • the result of another request with • the same issuer location and • the same surrounding users' locations

  38. Case-based module • Run the algorithm to generate the AS • Look for a similar case in the database; if found, return its result • If no case is found, check the reciprocity property directly • Change the k parameter if necessary • Update the database with the result • Send the result to the next step
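
A minimal Python sketch of this case-based flow; the grid-snapping notion of a "similar case", the CaseBase class, and handle_request are our simplifications (check_reciprocity can be the direct check sketched earlier).

    from typing import Dict, FrozenSet, Optional, Tuple

    Position = Tuple[float, float]

    class CaseBase:
        """Caches reciprocity verdicts so that similar requests skip the direct check."""

        def __init__(self, cell: float = 0.01) -> None:
            self._cell = cell
            self._cases: Dict[Tuple, bool] = {}

        def _key(self, issuer_pos: Position, neighbours: FrozenSet[Position], k: int):
            # A "similar case" is approximated by snapping the issuer's and the
            # surrounding users' positions to a coarse grid (our simplification).
            snap = lambda p: (round(p[0] / self._cell), round(p[1] / self._cell))
            return (snap(issuer_pos), frozenset(snap(p) for p in neighbours), k)

        def lookup(self, issuer_pos, neighbours, k) -> Optional[bool]:
            return self._cases.get(self._key(issuer_pos, neighbours, k))

        def store(self, issuer_pos, neighbours, k, safe: bool) -> None:
            self._cases[self._key(issuer_pos, neighbours, k)] = safe

    def handle_request(users: Dict[str, Position], issuer: str, k: int,
                       cloak, check_reciprocity, case_base: CaseBase):
        """Generate the AS, reuse a cached verdict for a similar case if one exists,
        otherwise check reciprocity directly and remember the outcome."""
        anonymity_set = cloak(users, issuer, k)
        neighbours = frozenset(users[u] for u in anonymity_set if u != issuer)
        verdict = case_base.lookup(users[issuer], neighbours, k)
        if verdict is None:
            verdict = check_reciprocity(cloak, users, issuer, k)
            case_base.store(users[issuer], neighbours, k, verdict)
        # A real module would adjust k and retry when the verdict is "unsafe".
        return anonymity_set if verdict else None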

  39. Experimental results [9] Spatial Generalization Algorithms for LBS Privacy Preservation

  40. Experimental results [9] Spatial Generalization Algorithms for LBS Privacy Preservation

  41. Chosen algorithms • Selection criteria • Architecture • Efficiency • Approach • Security • Candidate algorithms • Interval cloaking • nnASR • Grid

  42. Chosen algorithms • Architecture • Interval cloaking, nnASR, and Grid are all centralized


  44. Chosen algorithms • Approach • Interval cloaking: predefined regions • nnASR: dynamic regions • Grid: dynamic regions

  45. Chosen algorithms • Security • Interval cloaking and nnASR: def-unaware • Grid: def-aware

  46. Outline • Introduction • Overview of Attack Models • Classification of Defense Models • Snapshot, single-issuer, def-unaware attacks • Snapshot, single-issuer, def-aware attacks • Historical, single-issuer attacks • Multiple-issuer attacks • Evaluation Module • Conclusion

  47. Memorization Property • Definition • Single-issuer historical attacks • Query tracking attack • Maximum movement boundary attack • Multiple-issuer historical attacks • The notion of historical k-anonymity

  48. Memorization Property: Definition • k-anonymity property: the spatial cloaking algorithm generates a cloaked area that covers k different users, including the real issuer • In the diagram, issuer A sends request r to the privacy middleware, which forwards the generalized request r' to the service provider; the cloaked area of r' contains k users (A–E)
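
A minimal Python sketch of this k-anonymity condition, assuming rectangular cloaked areas; the helper name is_k_anonymous is ours.

    from typing import Dict, Tuple

    Rect = Tuple[float, float, float, float]          # (x_min, y_min, x_max, y_max)

    def is_k_anonymous(area: Rect, users: Dict[str, Tuple[float, float]],
                       issuer: str, k: int) -> bool:
        """The cloaked area satisfies k-anonymity if it covers at least k users,
        one of whom is the real issuer."""
        x0, y0, x1, y1 = area
        inside = [u for u, (x, y) in users.items() if x0 <= x <= x1 and y0 <= y <= y1]
        return issuer in inside and len(inside) >= k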

  49. Memorization Property: Definition • The k users in the cloaked area can easily move to different places over time • An attacker with knowledge of the users' exact locations then has a chance to infer the real issuer from the anonymity set → RISK!

  50. Memorization Property: Definition • Memorization property [5]: the spatial cloaking algorithm memorizes the movement history of each user and utilizes this information when building the cloaked area • In the diagram, the spatial cloaking algorithm processor combines the users' movement patterns with their current positions to produce the cloaked region
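
A minimal Python sketch of one way memorization can be used, under a simplified reading of [5]: the cloaker remembers, per issuer, the users that appeared in all previous cloaked areas for that issuer and keeps building areas around that same group (historical k-anonymity). The MemorizingCloaker class and its details are our assumptions, not the exact algorithm of [5].

    from typing import Dict, Optional, Set, Tuple

    Position = Tuple[float, float]

    class MemorizingCloaker:
        """Remembers, per issuer, which users were inside *all* of that issuer's
        previous cloaked areas and keeps building areas around that same group
        (a simplified take on historical k-anonymity)."""

        def __init__(self, k: int) -> None:
            self.k = k
            self._history: Dict[str, Set[str]] = {}

        def cloak(self, users: Dict[str, Position],
                  issuer: str) -> Optional[Tuple[float, float, float, float]]:
            previous = self._history.get(issuer, set(users))   # first request: anyone qualifies
            ix, iy = users[issuer]
            # Prefer users who appeared in every earlier area and are still present,
            # ordered by their current distance to the issuer.
            eligible = sorted((u for u in previous if u in users),
                              key=lambda u: (users[u][0] - ix) ** 2 + (users[u][1] - iy) ** 2)
            chosen = set(eligible[:self.k]) | {issuer}
            if len(chosen) < self.k:
                return None            # historical k-anonymity can no longer be met
            self._history[issuer] = chosen
            # Cloaked area: the bounding box of the chosen users' current positions.
            xs = [users[u][0] for u in chosen]
            ys = [users[u][1] for u in chosen]
            return (min(xs), min(ys), max(xs), max(ys))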
