Online Privacy
ECON 425/563 // CPSC 455/555, November 6, 2008

Presentation Transcript


  1. Online Privacy ECON 425/563 // CPSC 455/555 NOVEMBER 6, 2008

  2. Outline • Large amounts of sensitive information flow around the web. • Privacy-enhancing technology has been developed and deployed (example: P3P). • Economic approaches to the management of private information (Acknowledgements: L. Cranor, C. Lu, and H. Varian)

  3. Online privacy in the comics! (Cathy, February 25, 2000)

  4. Why is Cathy concerned? (Cathy, March 1, 2000)

  5. How did Irving find this out? • He snooped her email • He looked at the files on her computer • He observed the “chatter” sent by her browser • He set cookies through banner ads and “web bugs” that allowed him to track her activities across web sites

  6. What do browsers chatter about? Browsers chatter about: • IP address, domain name, organization • Referring page • Platform: O/S, browser • What information is requested: URLs and search terms • Cookies To anyone who might be listening: • End servers • System administrators • Internet Service Providers • Other third parties • Advertising networks • Anyone who might subpoena log files later

  7. A typical HTTP request
GET /retail/searchresults.asp?qu=beer HTTP/1.0
Referer: http://www.us.buy.com/default.asp
User-Agent: Mozilla/4.75 [en] (X11; U; NetBSD 1.5_ALPHA i386)
Host: www.us.buy.com
Accept: image/gif, image/jpeg, image/pjpeg, */*
Accept-Language: en
Cookie: buycountry=us; dcLocName=Basket; dcCatID=6773; dcLocID=6773; dcAd=buybasket; loc=; parentLocName=Basket; parentLoc=6773; ShopperManager%2F=ShopperManager%2F=66FUQULL0QBT8MMTVSC5MMNKBJFWDVH7; Store=107; Category=0
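To make the request above concrete from the receiving end, here is a minimal sketch, assuming nothing beyond the Python standard library (the port and log format are illustrative choices, not from the slides), of a server that records the chatter each browser volunteers:

```python
# Minimal sketch: a server that logs the "chatter" carried by every request.
# Python standard library only; port 8000 and the log format are assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

class ChatterLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        # All of this arrives with an ordinary page view.
        print("client IP:  ", self.client_address[0])
        print("request URL:", self.path)                              # includes GET search terms
        print("referer:    ", self.headers.get("Referer", "-"))       # the page you came from
        print("user agent: ", self.headers.get("User-Agent", "-"))    # browser and O/S
        print("cookies:    ", self.headers.get("Cookie", "-"))
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"logged\n")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ChatterLogger).serve_forever()
```

An end server, an ISP in the path, or anyone who later reads the resulting log files sees the same fields listed on slide 6.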

  8. What about cookies? • Cookies can be useful • used like a staple to attach multiple parts of a form together • used to identify you when you return to a web site so you don’t have to remember a password • used to help web sites understand how people use them • Cookies can do unexpected things • used to profile users and track their activities, especially across web sites

  9. How do cookies work? • A cookie stores a small string of characters • A web site asks your browser to “set” a cookie • Whenever you return to that site your browser sends the cookie back automatically • Cookies are only sent back to the site that set them (Diagram: on the first visit, the site tells the browser “Please store cookie xyzzy”; on later visits, the browser tells the site “Here is cookie xyzzy”)
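A minimal client-side sketch of the same set-then-return cycle, using Python's standard cookie jar; http://example.com is a placeholder for any site that actually sets a cookie:

```python
# Minimal sketch of the cookie cycle: stored on the first visit,
# sent back automatically on later visits to the same site.
import urllib.request
from http.cookiejar import CookieJar

jar = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

# First visit: any Set-Cookie header in the response is stored in the jar.
opener.open("http://example.com/")          # placeholder site
for cookie in jar:
    print("stored:", cookie.name, "=", cookie.value)

# Later visit: the opener adds a Cookie header with the stored values,
# but only for requests going back to the domain that set them.
opener.open("http://example.com/account")   # placeholder path
```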

  10. (Diagram) You search for medical information at a search engine and buy a book at an online book store; both pages carry banner ads from the same ad network, which sets a cookie during one visit and reads it back during the other. The ad company can get your name and address from the book order and link them to your search.

  11. Web bugs • Invisible “images” embedded in web pages that cause cookies to be transferred • Work just like banner ads from ad networks, but you can’t see them unless you look at the code behind a web page • Also embedded in HTML formatted email messages For more info on web bugs see: http://www.privacyfoundation.org/education/webbug.html
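Because a web bug is just a tiny third-party image, it can often be spotted in page source. A rough sketch that flags candidate tracking pixels in a saved page; the 1x1-size heuristic and the file name page.html are my own assumptions, not part of the slides:

```python
# Rough sketch: flag likely web bugs (tiny "invisible" images) in saved HTML.
# The 1x1 heuristic is an assumption; real detectors also check the image host.
from html.parser import HTMLParser

class WebBugFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if a.get("width") in ("0", "1") and a.get("height") in ("0", "1"):
            print("possible web bug:", a.get("src"))

with open("page.html", encoding="utf-8") as f:   # hypothetical saved page
    WebBugFinder().feed(f.read())
```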

  12. Referer log problems • GET methods result in values in the URL • These URLs are sent in the Referer header to the next host • Example: http://www.merchant.com/cgi_bin/order?name=Tom+Jones&address=here+there&credit+card=234876923234&PIN=1234 is sent as the Referer when the user next follows a link (e.g., to index.html)
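To see why this leaks, the slide's example URL can be pulled apart with the standard library; everything visible in the query string below also travels onward in the Referer header when the user follows a link to the next page:

```python
# Sketch: parsing the GET query string from the slide's example URL.
# Whatever appears here is also sent to the next site in the Referer header.
from urllib.parse import urlsplit, parse_qs

url = ("http://www.merchant.com/cgi_bin/order"
       "?name=Tom+Jones&address=here+there"
       "&credit+card=234876923234&PIN=1234")

for field, values in parse_qs(urlsplit(url).query).items():
    print(f"{field}: {values[0]}")
# name: Tom Jones
# address: here there
# credit card: 234876923234
# PIN: 1234
```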

  13. A Technological Approach The Platform for Privacy Preferences (P3P) is a standard, computer-readable format for privacy policies and a protocol allowing web browsers and other tools to read and process privacy policies automatically.

  14. Who created P3P? • World Wide Web Consortium (W3C) – a nonprofit, industry-supported consortium including researchers and engineers from over 420 institutions. • Participants in the development of P3P came from around the world, including representatives from industry, government, nonprofit organizations, and academia.

  15. Why was P3P created? • To increase consumer trust. “If the ability to spend is the fuel that propels the economic engine, then consumers’ trust and confidence in that engine is the lubricant.” • To protect privacy by allowing informed choice. Privacy is the ability of individuals to exercise control over the disclosure and subsequent uses of their personal information. Hence notice is fundamental to the individual’s ability to protect his or her privacy. • To make choice easy. Privacy policies are difficult and time-consuming to locate, to read, and to understand; and they change frequently without notice.

  16. How does P3P work? (1) • User sets personal privacy preferences on a tool such as a browser.

  17. How does P3P work? (2) 2. Browser requests privacy policy from a (P3P-compliant) Web site. 3. Browser compares the privacy policy with the user’s privacy preferences and acts accordingly. (Symbols, pop-up prompts, etc.)
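A toy sketch of step 3, the comparison a P3P user agent performs. The purpose names follow P3P's vocabulary, but the preference format is invented for illustration; real agents used richer rule languages such as APPEL:

```python
# Toy sketch of the policy-versus-preference check a P3P user agent performs.
# The preference format is invented; real agents used richer rule languages.
DISALLOWED_PURPOSES = {"telemarketing", "individual-analysis"}  # the user's choices

def evaluate(policy_purposes):
    """Return 'block' if the site declares any purpose the user rejects."""
    conflicts = DISALLOWED_PURPOSES & set(policy_purposes)
    return ("block", sorted(conflicts)) if conflicts else ("allow", [])

site_purposes = ["current", "admin", "telemarketing"]   # declared by the site
print(evaluate(site_purposes))                          # ('block', ['telemarketing'])
```

Depending on the result, the browser can show a symbol, pop up a prompt, or restrict cookies for that site.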

  18. P3P Policy Elements Include: • Who is collecting these data? • What information is being collected? • For what purpose? • Which information is being shared with others? • Who are these data recipients? • Can users access their identified data? • Can users make changes in how their data is used? • What is the policy for retaining data? • How are disputes resolved?

  19. Purpose Specifications: • Completion and support of activity for which data was provided • Web site and system administration • Research and development • One-time tailoring • Pseudonymous decision or analysis • Individual decision or analysis • Contacting visitors for marketing of services or products • Historical preservation • Contacting visitors for marketing of services or products via telephone • Other purpose
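The questions on slide 18 and the purposes on slide 19 map onto elements of a P3P policy document. Here is a minimal sketch that reads them out of a simplified, invented policy fragment; the element names follow the P3P 1.0 vocabulary, but the namespace and most required structure are omitted for brevity:

```python
# Sketch: extracting purposes, recipients, retention, and data elements
# from a simplified, invented P3P statement (namespace omitted for brevity).
import xml.etree.ElementTree as ET

policy_xml = """
<POLICY name="example" discuri="http://example.com/privacy.html">
  <STATEMENT>
    <PURPOSE><current/><telemarketing/></PURPOSE>
    <RECIPIENT><ours/><unrelated/></RECIPIENT>
    <RETENTION><stated-purpose/></RETENTION>
    <DATA-GROUP>
      <DATA ref="#user.name"/>
      <DATA ref="#user.home-info.postal"/>
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
"""

policy = ET.fromstring(policy_xml)
stmt = policy.find("STATEMENT")
print("purposes:  ", [e.tag for e in stmt.find("PURPOSE")])     # why the data is collected
print("recipients:", [e.tag for e in stmt.find("RECIPIENT")])   # who else gets it
print("retention: ", [e.tag for e in stmt.find("RETENTION")])   # how long it is kept
print("data:      ", [d.get("ref") for d in stmt.iter("DATA")]) # what is collected
```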

  20. What P3P Accomplishes • Makes privacy notices easy to locate and easy to understand. • Allows users to specify their privacy preferences once so that they can be automatically compared to a web site’s privacy policy. • Assists users in making decisions about when to disclose personal information, how much, and to whom.

  21. What P3P Does NOT Accomplish • Does NOT replace privacy regulations. • Can NOT protect the privacy of users in jurisdictions with insufficient data privacy laws. • Can NOT ensure that companies or organizations follow their stated privacy policies. “P3P does not protect privacy, in and of itself. It does, however, help create a framework for informed choice on the part of consumers. Any efficacy that P3P has is dependent upon the substantive privacy rules established through other processes – be they a result of regulatory, self-regulatory, or public pressure.”

  22. Controversy over P3P “In the context of proper legislation, P3P is the most promising solution to cyberspace privacy. It will make it easy for companies to explain their practices in a form that computers can read, and make it easy for consumers to express their preferences in a way that computers will automatically respect.” – Professor Lawrence Lessig, Stanford Law School.

  23. Controversy over P3P P3P is: a) Pretty Poor Privacy, b) a Pretext for Privacy Procrastination, and c) “a tacit acceptance of the great increase in the tracking and monitoring of our minor activities that take place over the Web.” – Karen Coyle, Information Technology Specialist, University of California

  24. Support for P3P • Provides notice and consent • Promotes transparency and accountability • Intuitive • Flexible and global • Worthwhile process

  25. Criticism of P3P • Lack of enforcement • Used as a procrastination tool • Unclear legal consequences • Importance of default settings • Unable to maintain current experience • Expensive to implement and maintain • Overly broad and vague purpose specifications • Ultimatum-style communication

  26. More Criticism of P3P • Consumer and business confusion • Rejected by the European Union • Lack of actual choice • Assumes the need to gather information • Does not address third-party data collection • Lack of control over an irreversible choice

  27. Basic Conflict What is the real problem? Lack of knowledge about how information will be used? OR The gathering of the data itself?

  28. Universal Agreement Enforcement mechanisms are needed. “A technical platform for privacy protection…must be applied within the context of a framework of enforceable data protection rules, which provide a minimum and non-negotiable level of privacy protection for all individuals. Use of P3P in the absence of such a framework risks shifting the onus primarily onto the individual user to protect himself” – European Commission, 1998.

  29. Economic View of Privacy • Non-adoption of P3P and other privacy-enhancing technologies is not due to technological flaws. It is due to economic incentives. • Rational consumers want some of their personal information available to producers. They will experience more privacy (e.g., less intrusive marketing) and reduced search costs if their true preferences are known.

  30. Complication: Secondary Use • Customers can benefit from collection and analysis of personal information by merchants with whom they transact directly. • If that information is sold to a third party that does not know the customers, that third party will use it more clumsily and reimpose cost on the customers. • The seller of this information has externalized these costs.

  31. Privacy as a Property Right • Data collectors can externalize the costs of secondary use because current law gives them property rights in the databases they construct. • Alternative: Vest the property rights in the data subjects, and compensate them for use of their data. • Varian gives examples of how to structure information markets and set prices: http://people.ischool.berkeley.edu/~hal/Papers/privacy • Opponents of this approach would rather ban the sale of personal information altogether and establish a true “right to privacy”; they argue that relying on property rights to control the dissemination and use of personal information would ensure that only the rich have privacy.
