I have a DREAM! (DiffeRentially privatE smArt Metering)

Presentation Transcript


  1. I have a DREAM! (DiffeRentially privatE smArt Metering) Gergely Acs and Claude Castelluccia {gergely.acs, claude.castelluccia}@inria.fr INRIA 2011

  2. Smart Metering • Electricity suppliers are deploying smart meters • Devices at home that report energy consumption periodically (every 10-20-30 minutes) • Should improve energy management (for suppliers and customers) … • Part of the Smart Grid (Critical Infrastructure)

  3. Privacy?

  4. Privacy? [Figure: household load curve with appliance signatures labeled: Microwave, Kettle, Hoover, Fridge, Lighting]

  5. Motivation: Privacy/Security • Potential threats • Profiling • Increase in the granular collection, use and disclosure of personal energy information • Data linkage of personally identifiable information with energy use • Creation of an entirely new "library" of personal information • Security • Is someone at home? • We want to prevent • Suppliers from profiling customers • Attackers from getting private information

  6. Contributions • First provably private scheme for smart metering • No need for trusted aggregator • No assumptions about the adversary’s power (knowledge) • Remains useful for the supplier • Robust against node failures!! • Secure against colluding malicious users • Validated by simulations • a new simulator to generate synthetic consumption data

  7. Overview • Model • Adversary model • Network model • Privacy model • Our scheme: distributed aggregation with encryption • Performance and privacy analysis • Conclusions

  8. Model • Dishonest-but-non-intrusive adversary • does not follow the protocol correctly • may collude with malicious users • BUT: cannot access the distribution network (e.g., to install wiretapping devices) • Network model • No communication between meters! • Each meter has a public/private key pair • Privacy model • Differential privacy model

  9. Why Differential Privacy? • There are different possible models (k-anonymity, l-diversity, …) • We use the Differential Privacy model • The only model that makes no assumptions about the attacker's background knowledge • Proposes a simple off-the-shelf sanitization technique • Strong (too strong?) and provides provable privacy!

  10. The Differential Privacy Model • Informally, a sanitization algorithm A is differentially private if its output is insensitive to changes in any individual value • Definition: A is ε-differentially private if, given 2 datasets (sets of traces) I and I′ differing in only one user, and any output x: Pr[A(I) = x] ≤ e^ε · Pr[A(I′) = x] • First model that provides provable privacy! • …and makes no assumptions about the adversary! • Very strong (too strong?)

  11. Sanitization • A simple solution is to add noise to each sample in each slot: X̂ᵢ(t) = Xᵢ(t) + L(λ) • It can be shown that if the noise L(λ) follows a Laplace distribution with scale parameter λ = Δ/ε, where Δ is the sensitivity (i.e., the maximum value a sample can take), then X̂ᵢ(t) is ε-private in each slot
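
A minimal sketch of this Laplace mechanism in Python (function and parameter names are illustrative, not from the paper; the 3 kW sensitivity is an assumed maximum household draw):

```python
import numpy as np

def sanitize(samples, epsilon, sensitivity):
    """Add Laplace noise to each per-slot sample so that each
    slot is epsilon-differentially private (scale = sensitivity/epsilon)."""
    scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=len(samples))
    return samples + noise

# One day of 10-minute readings (144 slots), assuming a 3 kW maximum draw
readings = np.random.uniform(0.0, 3.0, size=144)
noisy = sanitize(readings, epsilon=1.0, sensitivity=3.0)
```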

  12. Sanitization: Example [Figure: original vs. sanitized consumption, sum over 4 slots]

  13. Aggregating Data [Figure: meters send samples to an Aggregator, which forwards the aggregate to the Electricity Supplier] • Supplier gets the (noisy) aggregated value but can't recover individual samples!

  14. Error/utility • The larger the cluster, the better the utility… but the smaller the granularity

  15. Noised Aggregated Data: Sum of N samples + Laplace noise [Figure: noisy vs. true aggregate for N=200 and N=600]
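
A quick Monte-Carlo sketch of why the relative error shrinks as the cluster grows (all parameters illustrative; after sanitization the aggregate carries a single Laplace noise of scale Δ/ε):

```python
import numpy as np

def relative_error(n_users, epsilon=1.0, sensitivity=3.0, trials=2000):
    """Estimate |noise| / true_sum when Laplace(sensitivity/epsilon)
    noise sits on the sum of n_users samples."""
    rng = np.random.default_rng(0)
    true_sums = rng.uniform(0.0, 3.0, size=(trials, n_users)).sum(axis=1)
    noise = rng.laplace(0.0, sensitivity / epsilon, size=trials)
    return np.mean(np.abs(noise) / true_sums)

for n in (100, 200, 600):
    print(n, relative_error(n))   # relative error drops roughly as 1/n
```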

  16. Aggregating Data: Pros/Cons • Pros: • Great solution to reduce noise/error • … and still generate useful (aggregated) data for the supplier • … with strict privacy guarantees • Cons: • Aggregators have to be trusted! • Who can be the aggregator? Supplier? Network? Can we get rid of the aggregator and still perform aggregation??

  17. Distributed Aggregation [Figure: meters send noised, encrypted samples directly to the Electricity Supplier, with no aggregator]

  18. Our Approach: Distributed Aggregation • Step 1: Distributed noise generation • We use the fact that a Laplace noise can be generated as a sum of Gamma noises: L(λ) = Σᵢ (Gᵢ − G′ᵢ), i = 1…N, where Gᵢ, G′ᵢ are i.i.d. Gamma(1/N, λ) • Each node adds its Gamma noise share to its sample and sends the result to the supplier • When the noised samples are aggregated by the supplier, the Gamma noises add up to a Laplace noise… • No more aggregator needed!
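
A sketch of this divisibility property (names and cluster size illustrative): each meter draws the difference of two Gamma(1/N, λ) variables, and the N shares sum to a Laplace(λ) variable.

```python
import numpy as np

rng = np.random.default_rng(42)

def noise_share(n_users, scale):
    """One meter's share: difference of two Gamma(1/N, scale) draws.
    Summed over all N meters, the shares are exactly Laplace(scale)."""
    return rng.gamma(1.0 / n_users, scale) - rng.gamma(1.0 / n_users, scale)

N, lam = 200, 3.0   # illustrative cluster size and noise scale
totals = [sum(noise_share(N, lam) for _ in range(N)) for _ in range(2000)]
print(np.var(totals), 2 * lam**2)   # Laplace(lam) variance is 2*lam^2
```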

  19. Problem [Figure: original data vs. gamma-noised data] • The added Gamma noise is too small to guarantee privacy of individual measurements! • The supplier can possibly retrieve a sample value from its noised version!

  20. Step 2: Encrypting noised samples [Figure: meters send encrypted, noised samples to the Electricity Supplier]

  21. Performance and privacy analysis • A new trace generator • Error depending on the number of users • Privacy over multiple slots • Privacy of appliance usages and different activities (cooking, watching TV, …) • Privacy of being home

  22. Trace generation

  23. Error and the number of users [Figure: error vs. number of users, for ε over a single slot]

  24. Privacy of appliances • Noise is added to guarantee ε=1 per slot ⇒ error is 0.17 with 100 users

  25. Privacy of the simultaneous usage of active appliances (Are you at home?) • 0.17 error for 100 users (ε=1 per slot)

  26. Privacy of the simultaneous usage of all appliances • 0.17 error for 100 users (ε=1 per slot)

  27. Conclusion • First practical scheme that provides formal privacy and utility guarantees… • Our scheme uses aggregation + noise • Validation based on realistic datasets (generated by simulator) • We can guarantee meaningful privacy for some activities (or appliances) but cannot hide everything! • Privacy can be increased by adding more noise but we have to add more users to ensure low error!

  28. Encryption • Modulo-addition based: cᵢ = (x̂ᵢ + kᵢ) mod m, where x̂ᵢ is the noised sample and the key kᵢ is not known to the supplier

  29. Key generation • Each node pair shares a symmetric key • Each node randomly picks x other nodes such that if v selects w then w also selects v. Example for two nodes: • v selects w (and w selects v) when the mutual selection rule holds • v and w derive a shared encryption key K from their symmetric key • v → supplier: c_v = (x̂_v + K) mod m • w → supplier: c_w = (x̂_w − K) mod m • Supplier decrypts by adding the ciphertexts: (c_v + c_w) mod m = (x̂_v + x̂_w) mod m
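
A toy sketch of slides 28-29 (the modulus, key derivation, and quantization of samples to integers are all illustrative assumptions, not the paper's exact construction):

```python
import secrets

MOD = 2**32   # must exceed the sum of all (quantized, integer) samples

def pairwise_keys(pairs, n_nodes):
    """Each selected pair (v, w) shares a fresh random key; v adds it,
    w subtracts it, so all keys cancel modulo MOD in the aggregate."""
    keys = [0] * n_nodes
    for v, w in pairs:
        k = secrets.randbelow(MOD)
        keys[v] = (keys[v] + k) % MOD
        keys[w] = (keys[w] - k) % MOD
    return keys

def encrypt(sample, key):
    return (sample + key) % MOD   # modulo-addition encryption

# Two-node example from the slide: v (=0) and w (=1) select each other
keys = pairwise_keys([(0, 1)], n_nodes=2)
c_v, c_w = encrypt(10, keys[0]), encrypt(20, keys[1])
assert (c_v + c_w) % MOD == 30   # the supplier recovers only the sum
```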

  30. Security analysis • Misbehaving users: • the supplier can deploy fake meters (an α fraction of the N nodes), or some users collude with the supplier and omit adding noise • each user adds extra noise to tolerate this attack… • the supplier lies about the cluster size • … • See the report for proofs/details

  31. Error and the number of misbehaving users (ε=1 per slot)

  32. Why is aggregation not enough? • Why does noise have to be added? • Because we make no assumptions about the adversary model… • E.g., if it knows (N−1) values, it can get the N-th value… even with aggregation and encryption • But it can't get any info about the N-th value if noise is added ;-) • Very strong guarantee!

  33. Laplace Distribution
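
For reference, the density and moments of the Laplace distribution used throughout (standard facts, not transcribed from the slide):

```latex
f_\lambda(x) = \frac{1}{2\lambda}\, e^{-|x|/\lambda},
\qquad \mathbb{E}[X] = 0,
\qquad \operatorname{Var}(X) = 2\lambda^2 .
```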

  34. Privacy over multiple slots • Composition property of diff. privacy: if we have ε₁ and ε₂ privacy in two different slots, then we have (ε₁+ε₂) privacy over the two slots • Note: ε=1 is an upper bound (for all users) in each slot! For a user with consumption c(t), the exact per-slot bound is ε·c(t)/Δ • Over multiple slots: ε_total = Σ_t ε·c(t)/Δ
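
For concreteness, a hypothetical computation under this composition rule (the consumption values and Δ below are invented for illustration):

```latex
% Assume \Delta = 3\ \mathrm{kW} and a per-slot budget \varepsilon = 1.
% A user consuming c = (0.6,\ 1.5,\ 3.0)\ \mathrm{kW} over three slots gets
\varepsilon_{\text{total}}
  = \sum_t \varepsilon \frac{c(t)}{\Delta}
  = 1 \cdot \frac{0.6 + 1.5 + 3.0}{3}
  = 1.7 .
```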

  35. Example

  36. Differential Privacy Model: interpretation • If ε = 1: the output distributions on I and I′ differ by at most a factor e¹ ≈ 2.72 • If ε = 0.5: at most a factor e^0.5 ≈ 1.65 • If ε = 0.1: at most a factor e^0.1 ≈ 1.11 • [Figure: an attacker seeing the output must guess: was the input I or I′?] • Similar idea to indistinguishability in crypto…
