
Offline Auditing for Privacy


Presentation Transcript


  1. Offline Auditing for Privacy Jeff Dwoskin, Bill Horne, Tomas Sander Trusted Systems Laboratory Princeton

  2. Why Auditing for Privacy? Potential advantages • Collect and analyze log data to detect privacy violations offline • May also work where enforcement doesn’t • Create a trail of what happened to privacy-sensitive data for • Documentation • Forensics • Demonstrating compliance with internal privacy policy • Watching the watchers

  3. Two challenges • How can we audit for the benefit of privacy? • Privacy violation detection system functionality • Compliance functionality • How can auditing itself be performed in a privacy-friendly and secure way? • Integrity • Encrypted storage • Pseudonymization and anonymization of audit file data • Etc.

  4. What can we collect? • Data access • User, application, time, data record accessed • Source • E.g. machine the request came from, internal/external, etc. • Part of the data record itself • E.g. age of the data record’s subject • Consent information present • Opt-in, opt-out • Privacy-sensitive activities • Deletion of records • Consequences • E.g. an alert issued where enforcement is inappropriate
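
To make the list above concrete, here is a minimal sketch of what one collected audit record might look like, assuming a Python-based collector; the field names and types are illustrative assumptions, not a format defined by the authors.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class AuditRecord:
        user: str                          # who accessed the data
        application: str                   # application that issued the request
        time: datetime                     # when the access happened
        record_id: str                     # data record accessed
        source: str                        # e.g. originating machine, "internal" or "external"
        subject_age: Optional[int]         # part of the record itself, e.g. age of the data subject
        consent: str                       # consent status, e.g. "opt-in" or "opt-out"
        activity: str                      # e.g. "read", "delete"
        consequence: Optional[str] = None  # e.g. "alert issued" where enforcement is inappropriate

    # Example record
    rec = AuditRecord(user="alice", application="billing", time=datetime.now(),
                      record_id="patient-4711", source="internal", subject_age=34,
                      consent="opt-in", activity="read")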

  5. How can we analyze collected data? • Against simple privacy policy rules • (e.g., expressed in languages like EPAL) • Have counters and collect statistics about behaviors that might be suspicious. • Organize them into reports. • Hope: • Offline auditing can be more sophisticated due to lack of real-time requirements.
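
As a rough sketch of the counter-and-report idea, assuming records shaped like the hypothetical AuditRecord above; the threshold and the "suspicious volume" rule are illustrative assumptions.

    from collections import Counter

    def access_report(records, threshold=50):
        """Summarize accesses per user and flag unusually high volumes (illustrative rule)."""
        per_user = Counter(r.user for r in records if r.activity == "read")
        return {
            "accesses_per_user": dict(per_user),
            "high_volume_users": [u for u, n in per_user.items() if n > threshold],
        }

    # Example: access_report([rec]) -> {'accesses_per_user': {'alice': 1}, 'high_volume_users': []}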

  6. What does HIPAA say about auditing? We propose that audit control mechanisms be put in place to record and examine system activity. We adopt this requirement in the final rule.

  7. How is this interpreted? • Create events: creation of records that contain PHI, import of records that contain PHI • Modify events: editing of data, re-association of data, de-identification of PHI • View events: access to PHI by any user, export of PHI to digital media or network, print or fax of PHI • Delete events: user command to delete PHI, automated command to delete PHI • Non-PHI events: user login & logout, changes to user accounts, detection of a virus, network link failures, changes to network security configuration, etc.
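
One possible way to encode this taxonomy in a collector is a simple event-type enumeration; the names below are shorthand for the categories on the slide, not an interpretation mandated by HIPAA.

    from enum import Enum

    class AuditEventType(Enum):
        # PHI events
        CREATE = "creation or import of records that contain PHI"
        MODIFY = "editing, re-association, or de-identification of PHI"
        VIEW   = "access, export, print, or fax of PHI"
        DELETE = "user or automated deletion of PHI"
        # Non-PHI events
        AUTH           = "user login and logout"
        ACCOUNT_CHANGE = "changes to user accounts"
        SECURITY_EVENT = "virus detection, network link failures, security configuration changes"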

  8. What kinds of things might you look for? • access to PHI by anyone not directly involved in the patient’s treatment, payment, or healthcare operations • access to information not corresponding to the role of the user • access to PHI of VIPs or community figures • access to records that have not been accessed in a long time • access to PHI of an employee • access to PHI of a terminated employee • access to sensitive records such as psychiatric records • access to PHI of minors • data recorded without a corresponding order
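
A rough sketch of how a few of these heuristics could be run offline over collected records; the role check, VIP list, and employee list are hypothetical inputs introduced only for illustration.

    def flag_access(rec, user_role, role_allowed, vip_records, employee_records):
        """Return the reasons (if any) this access looks suspicious, per the heuristics above."""
        flags = []
        if not role_allowed(user_role, rec.record_id):
            flags.append("access does not correspond to the role of the user")
        if rec.record_id in vip_records:
            flags.append("access to PHI of a VIP or community figure")
        if rec.record_id in employee_records:
            flags.append("access to PHI of an employee")
        return flags

    # Example:
    # flag_access(rec, "billing-clerk", lambda role, rid: role == "physician",
    #             vip_records={"patient-0001"}, employee_records=set())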

  9. Pseudonymization • Work by Flegel: • Audit data is intercepted by a local pseudonymiser and then forwarded by syslog to remote hosts or stored • The pseudonymiser substitutes (predefined) identifying features (types of identifying info) with shares, generated via Shamir’s secret sharing scheme • The record is encrypted under a key K; K can be reconstructed if at least k shares are found
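
For readers unfamiliar with the threshold mechanism, here is a minimal sketch of Shamir’s (k, n) secret sharing over a prime field; the parameters and the direct use of the key as an integer are simplifications for illustration, not Flegel’s actual implementation.

    import secrets

    P = 2**127 - 1  # a Mersenne prime, large enough to hold a 16-byte key (illustrative choice)

    def make_shares(secret, k, n):
        """Split an integer secret (< P) into n shares; any k of them recover it."""
        coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
        poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, poly(x)) for x in range(1, n + 1)]

    def recover(shares):
        """Reconstruct the secret via Lagrange interpolation at x = 0."""
        secret = 0
        for j, (xj, yj) in enumerate(shares):
            num, den = 1, 1
            for m, (xm, _) in enumerate(shares):
                if m != j:
                    num = num * (-xm) % P
                    den = den * (xj - xm) % P
            secret = (secret + yj * num * pow(den, P - 2, P)) % P
        return secret

    # Example: the record key K is recoverable from any 3 of 5 shares.
    K = secrets.randbelow(P)
    shares = make_shares(K, k=3, n=5)
    assert recover(shares[:3]) == K and recover(shares[1:4]) == K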

  10. Further work on pseudonymization • Anonymouse log file anonymiser: analysis is possible, but anonymised data cannot be recovered • Privacy-enhanced IDSs support the recovery of pseudonymised info, e.g. IDA, AID

  11. Searching encrypted log data • Ex: public-key based solutions: • IBE-based solutions • Waters, Balfanz, Durfee, Smetters • Boneh, Di Crescenzo, Ostrovsky, Persiano • Idea: • In Identity-Based Encryption (IBE), every string can be used as a public key for encryption • The corresponding decryption key is supplied by a key distribution center (KDC)

  12. Searching Encrypted Log Files II • Encryption: • For each document m, choose a random symmetric key K and encrypt m under K • For keywords w1, …, wl in m, encrypt (FLAG, K) with the public keys w1, …, wl • Store the results c1, …, cl with the encrypted document • Keyword search: • For keyword w, the investigator requests the private key corresponding to w from the KDC • For each document m, the investigator attempts decryption of c1, …, cl • If FLAG is found, the document contains w and K is recovered
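
To make the trial-decryption idea concrete, here is a rough sketch that replaces the IBE step with a key derived by hashing the keyword; this is an illustrative stand-in only, since the schemes cited above use identity-based encryption precisely so that only the KDC can release per-keyword decryption keys.

    import hashlib, os

    FLAG = b"AUDITFLAG"

    def _keystream(key, nonce, length):
        """SHA-256 counter-mode keystream (illustrative only, not production crypto)."""
        out, ctr = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + nonce + ctr.to_bytes(4, "big")).digest()
            ctr += 1
        return out[:length]

    def encrypt_for_keyword(keyword, record_key):
        """Encrypt (FLAG, record_key) under a key derived from the keyword."""
        kw_key = hashlib.sha256(keyword.encode()).digest()   # stand-in for the IBE public key w
        nonce = os.urandom(16)
        pt = FLAG + record_key
        ct = bytes(a ^ b for a, b in zip(pt, _keystream(kw_key, nonce, len(pt))))
        return nonce + ct

    def try_keyword(keyword, blob):
        """Trial decryption: return the record key K if this blob was tagged with the keyword."""
        kw_key = hashlib.sha256(keyword.encode()).digest()
        nonce, ct = blob[:16], blob[16:]
        pt = bytes(a ^ b for a, b in zip(ct, _keystream(kw_key, nonce, len(ct))))
        return pt[len(FLAG):] if pt.startswith(FLAG) else None

    # Example: a log entry tagged with keyword "patient-4711"
    K = os.urandom(16)
    c = encrypt_for_keyword("patient-4711", K)
    assert try_keyword("patient-4711", c) == K
    assert try_keyword("some-other-keyword", c) is None   # fails only with negligible probability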
