COPS: Community-Oriented Privacy System


Presentation Transcript


  1. COPS: Community-Oriented Privacy System The Email Prototype

  2. COPS • Introduction • Motivation • Email Prototype • Future work

  3. Individual Privacy • What is privacy? • The quality or state of being apart from company or observation • How do we protect privacy? • Doors, locks, alarms, fences/gates • Passwords, encryption, information flow • Whose privacy is being protected? • The individual

  4. Group Privacy • What is a ‘group’? • Collaborative, Cooperative, Collective • What is privacy? • A boundary regulation process that is context dependent • How is privacy protected? • A dynamic process driven by a group of users

  5. Communication • How do groups of people share information? • In person • Postal mail • Telephone • Email/Fax • Social networking • Cloud

  6. Motivation • Facebook • News Feed (2006) • Facebook changed its privacy defaults so that status updates, images, and other user-created content became public by default, prompting more than a third of Facebook users to alter their privacy settings. • Beacon (2007-2009) • Beacon was part of Facebook's advertising system; it sent data from external websites to Facebook to enable targeted advertisements and to let users share their activities with their friends. • Cookies (Oct. 2011) • Facebook was sued for “tracking, collecting, and storing its users’ wire or electronic communications, including but not limited to portions of their internet browsing history even when the users were not logged-in to Facebook.”

  7. Motivation • Cloud privacy concerns • Amazon’s cloud browser • “All of your web surfing habits will transit Amazon's cloud. If you think that Google AdWords and Facebook are watching you, this service is guaranteed to have a record of everything you do on the Web.” – Chester Wisniewski, a senior security adviser at British computer security firm Sophos • Dropbox • “Insecure by design” – Derek Newton of Information Security Insights • Will turn your files over to the government if asked

  8. Motivation • Have you ever sent an email to the wrong person accidentally? • “One of Eli Lilly & Co.'s sub-contracted lawyers at Philadelphia based Pepper Hamilton had mistakenly emailed confidential Eli Lilly's discussions to Times reporter Alex Berenson (instead of Bradford Berenson, her co-counsel), costing Eli Lilly nearly $1 billion.” (Zilberman 2010) • Other issues: Huge recipient lists, similar/duplicate names

  9. Email Leakage • Are there any solutions out there to stop this from happening? • Gmail undo: lets the sender retract a message, but only within a few seconds of sending • Complex privacy policies (Leon 2011) • “Online opt-out tools were challenging for users to understand and configure”

  10. COPS • Community-Oriented Privacy System (COPS) • Privacy boundaries are defined through “community tags” • Regulation of privacy is provided through mechanisms for setting, changing, and making exceptions to the community tags • A sense of community is realized through mechanisms for notification (making actions of individuals visible to the group) and consensus (allowing the group to vote on changes and exceptions); a sketch of both mechanisms follows below
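The mechanisms above are described only in prose; below is a minimal Python sketch of how a community tag with notification and consensus voting might be modeled. The names (CommunityTag, notify, a simple-majority rule) are illustrative assumptions, not the prototype's actual design.

    from dataclasses import dataclass, field

    def notify(recipients: set, message: str) -> None:
        """Stand-in for the notification mechanism: make an action visible to the group."""
        for member in sorted(recipients):
            print(f"[notify {member}] {message}")

    @dataclass
    class CommunityTag:
        name: str                                   # e.g. "hr-confidential" (invented example)
        members: set                                # addresses inside the privacy boundary
        pending_members: set = field(default_factory=set)
        votes: dict = field(default_factory=dict)   # voter -> approve?

        def propose_change(self, proposer: str, new_members: set) -> None:
            """Start a group vote on a boundary change and notify everyone."""
            self.pending_members = set(new_members)
            self.votes = {proposer: True}
            notify(self.members, f"{proposer} proposed changing tag '{self.name}'")

        def vote(self, voter: str, approve: bool) -> None:
            """Record a vote; apply the change once a simple majority approves."""
            self.votes[voter] = approve
            if sum(self.votes.values()) > len(self.members) / 2:
                self.members = self.pending_members
                notify(self.members, f"Tag '{self.name}' updated by group consensus")

    # Example: three members; two approvals form a majority, so the boundary change is applied.
    tag = CommunityTag("hr-confidential",
                       {"ann@example.com", "bob@example.com", "cai@example.com"})
    tag.propose_change("ann@example.com", {"ann@example.com", "bob@example.com"})
    tag.vote("bob@example.com", True)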

  11. COPS • Why take a community-based approach? • Shared expertise - privacy-enhancing features such as rules and settings are underutilized by individuals. Shared expertise leads to better use of the privacy mechanisms and better awareness of privacy requirements and privacy threats. • Shared responsibility - social pressure from the group encourages the individual to attend to otherwise-neglected privacy tasks, out of a sense of responsibility to the community.

  12. COPS • Community Tag • Protect privacy by requiring group consensus for tag creation, modification, and exceptions • Tag usage is enforced by the users themselves (a send-time check is sketched below)
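As a companion to the tag model sketched above, the following standalone sketch shows what send-time enforcement could look like: the message is held and an exception vote is requested whenever a recipient falls outside the tag's boundary. check_outgoing and request_exception are hypothetical names; the prototype's real policy may differ.

    def request_exception(tag_name: str, members: set, sender: str, outsiders: list) -> None:
        """Hold the message and ask the tag's community to vote on an exception."""
        for member in sorted(members):
            print(f"[notify {member}] {sender} asks to send '{tag_name}' mail to {outsiders}")

    def check_outgoing(tag_name: str, members: set, sender: str, recipients: list) -> bool:
        """Allow a tagged message only if every recipient is inside the boundary."""
        outsiders = [r for r in recipients if r not in members]
        if not outsiders:
            return True
        request_exception(tag_name, members, sender, outsiders)  # group consensus decides later
        return False

    # Example: one recipient is outside the "hr-confidential" boundary, so the send is blocked.
    ok = check_outgoing("hr-confidential",
                        {"ann@example.com", "bob@example.com"},
                        "ann@example.com",
                        ["bob@example.com", "alex.berenson@nytimes.example"])
    print(ok)  # False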

  13. COPS • Threat model - in computer security, threat modeling describes the set of threats that the system's designer aims to defend against. To achieve a sense of privacy, we must address four areas in particular: • Accidental disclosure • Lack of awareness • Inability to understand the privacy rules or system interfaces • Inability of the system to guarantee desired privacy

  14. COPS • Accidental disclosure - a user sends an email to the wrong recipient or forwards an email to someone who should not have seen it. We anticipate that most privacy breaches are the result of accidental disclosure, and we have seen that such disclosures can have significant ramifications; a simple wrong-recipient check is sketched below.
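One way a client might catch this kind of mistake, offered purely as an illustration rather than the prototype's documented behavior, is to flag a recipient whose address resembles a different known contact, as in the Alex/Bradford Berenson mix-up. The surname heuristic and the example addresses are invented.

    def _surname(address: str) -> str:
        """Last dot-separated token of the local part, e.g. 'berenson' for alex.berenson@..."""
        return address.split("@", 1)[0].split(".")[-1].lower()

    def flag_lookalike_recipients(recipients: list, contacts: list) -> list:
        """Return (recipient, lookalike contact) pairs that may indicate a mix-up."""
        return [(r, c) for r in recipients for c in contacts
                if r != c and _surname(r) == _surname(c)]

    # A mix-up like the one in the Eli Lilly case would be flagged before sending:
    print(flag_lookalike_recipients(
        ["alex.berenson@nytimes.example"],
        ["bradford.berenson@lawfirm.example"]))
    # -> [('alex.berenson@nytimes.example', 'bradford.berenson@lawfirm.example')]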

  15. COPS • Lack of awareness - related to accidental disclosure, but caused by a lack of knowledge rather than a simple mistake. These breaches most likely occur through email forwarding and long recipient lists: tracking an email's forwarding history can be a daunting task, and reading through a recipient list of more than a dozen people is an exercise in frustration (see the sketch below).
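A hedged sketch of how these two problems could be surfaced to the user, assuming a hypothetical X-COPS-Forwarded-By header that the client appends on every forward; the function names and summary format are illustrative, not taken from the slides.

    from email.message import EmailMessage

    def record_forward(msg: EmailMessage, forwarder: str) -> None:
        """Append the forwarder to the message's forwarding trail (duplicate headers accumulate in order)."""
        msg["X-COPS-Forwarded-By"] = forwarder

    def forwarding_chain(msg: EmailMessage) -> list:
        """Everyone who has forwarded this message, oldest first."""
        return msg.get_all("X-COPS-Forwarded-By", [])

    def recipient_summary(msg: EmailMessage, tag_members: set) -> str:
        """Condense a long To/Cc list into a one-line awareness cue."""
        raw = ",".join(filter(None, [msg.get("To", ""), msg.get("Cc", "")]))
        recipients = [addr.strip() for addr in raw.split(",") if addr.strip()]
        outside = [r for r in recipients if r not in tag_members]
        return f"{len(recipients)} recipient(s), {len(outside)} outside the community tag"

    # Example usage:
    msg = EmailMessage()
    msg["To"] = "ann@example.com, bob@example.com"
    record_forward(msg, "ann@example.com")
    print(forwarding_chain(msg))                         # ['ann@example.com']
    print(recipient_summary(msg, {"ann@example.com"}))   # 2 recipient(s), 1 outside the community tag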

  16. COPS • Inability to understand the privacy rules or system interfaces - as the designers of this system, we are responsible for providing an interface that is easy to use and understand. Our research should examine the metaphors and current practices that people already use to protect privacy, and leverage those practices in the interface.

  17. COPS • Inability of the system to guarantee desired privacy - our research must span multiple domains, from academia to the business world and from small informal groups to large-scale group communication. Different domains have different requirements and enforce privacy in different ways. We must account for these differing practices and design a system that translates easily from one domain to another.

  18. COPS • Email Prototype • Proof of concept • Mockups • High fidelity prototype

  19.-23. COPS (prototype mockup slides; screenshots not included in this transcript)

  24. COPS • High fidelity prototype

  25. COPS • Usability Studies • Single user observations • Interactive task-based experiment • In the experiment the user will take the role of an employee working in the Human Resources department at a fictitious company. • Tasks will start out easy and straightforward and will ramp up in difficulty as the participant works through the problems.

  26. COPS • Evaluation • Gather data from intro/exit questionnaires, post-experiment interviews, observation, system monitoring • Measure performance through ease of use, task completion time, error rate, and user satisfaction • Most importantly, do users understand the new privacy features and do they find them useful in the context of an Email application?

  27. COPS • Future work • Group of users interacting with the system at once • Writing a plug-in for existing Email applications • File system/cloud implementation
