
Privacy Technology


Presentation Transcript


  1. Privacy Technology: Analysis and Mechanisms. David Chaum

  2. Privacy is fundamentally important! • It is essential for democracy • It is needed for participation without fear of retribution • It is a fundamental human right

  3. OUTLINE of “Privacy Technology”: Analysis (Policy, Economic) and Solution Mechanisms (Legal, Technological)

  4. Policy Analysis The actors and macro considerations

  5. Hierarchy of IT Needs of Humans (cf. Maslow’s Hierarchy of Needs) • Self-Worth—relation to artificial intelligence, etc. • Privacy—identity, credential & role protection • Interaction—communication, exploration, commerce • Security—uptime, robustness, no hacking • Processing—storage, interface, crunching

  6. Policy Issues

  7. Economic Analysis These days, everybody’s an economist!

  8. Monetizing privacy • Various schemes have been proposed (even 20+ years ago) • Consumers pay for privacy protection services • Consumers are paid for use of their privacy-related data • A brokerage of privacy-related data

  9. Imbalance in desire for privacy/data
  • Individuals discount the present value of privacy protection in transactions
    • This explains the seemingly anomalous behavior of consumers when confronted with cost or inconvenience
    • Practices and potential dangers are unknown to them
  • Organizations value personal data
    • They overestimate the future potential of the data
    • They discount their own exposure
    • They are not very concerned about dangers posed to consumers that they are not accountable for

  10. Imbalance in size/power of entities • Organizations have lots of leverage • There are few sources of mass products and services • Consumers don’t have much choice for many products or services • The relative cost of changing practices is high for consumers

  11. Legal mechanisms Powerful but don’t work well directly

  12. Legal mechanisms—evolution • Originally based on codifying legitimate expectation of privacy • People should be able to review and amend data • No erosion of privacy due to technology • Best privacy protection practical

  13. Legal mechanisms—capabilities
  • Accountability after the fact is ineffective
  • Hardly able to address:
    • Covert/clandestine abuse
    • Abuse of public or leaked data
    • The corporate shield
    • Undoing damage done to people
  • Can cause the creation and use of infrastructure

  14. Technological Mechanisms The directly-effective mechanism

  15. Locus of privacy-related control—the critical architectural choice (organization / infomediary / individual)

  16. Locus of control—Three choices:
  • At organizations
    • Weak benefit/effect for consumers
    • Clandestine abuse, leaks, reversibility…
    • Mollifies/diffuses the issue – prevents effective solutions
  • At an intermediary
    • Creates an infrastructure with a single point of failure
    • Full cost but little true benefit
    • Dangerous concentration
  • At the individual
    • Privacy technology – the only good solution

  17. Old paradigm—assumptions/model proven false! • Believed to be a zero-sum game, privacy v. security • ID believed needed for security against abuse by individuals • ID believed only way to organize data

  18. Old Paradigm

  19. New paradigm • Individuals provide organizations with minimum sufficient information and proof of its correctness

  20. Privacy Technology Win-Win break of the believed tradeoff

  21. New Paradigm

  22. Feasibility of a comprehensive solution set has been proven • Payments—eCash payments deployed by major banks on 4 continents • Communication—Mix nets, onion routing, etc. have been widely deployed • Credentials—mechanisms implemented on cards and by IBM

  23. Benefits to organizations (micro)
  • Reduced exposure/liability
  • Better data
    • Cleaner, because there is less deception and garbage
    • More willingness to provide data because of the protections
  • All organizations get the data: a level playing field
  • Better public image (?) – probably wrong!

  24. Not easy to get there from here • Requires lots of users (hard to be anonymous alone!) • Difficult to get the system “primed” • Consumers don’t want to pay costs • Organizations tend to resist change

  25. Really an “infrastructure issue” • Pseudonymity / Anonymity only “in numbers” (as mentioned) • Communication infrastructure can nullify protections • Way to share data pseudonymously is infrastructure

  26. CONCLUSION A “Privacy Technology” infrastructure is the way to go and would be hugely beneficial

  27. Kinds of Privacy for Payments [chart plotting privacy/consumer control against technology/time; categories: no privacy, false privacy (privacy that is merely advertised, or protection only from the merchant), organization-controlled privacy, and consumer-controlled privacy; payment types placed on the chart include credit cards on the Internet, government payments such as transfer-order systems, stored-value cards, pre-paid phone cards, bank notes & coins, cards bought/reloaded without identification, and eCash™]

  28. Consumer Payments Market Space [chart dividing the market into scheduled vs. irregular payments and low-value vs. high-value payments, with $10 marked on the value axis]

  29. Electronic Cash • You can buy a digital “bearer” instrument from a bank with funds in your account • You can pay by giving the instrument to the payee, who deposits it to an account

  30. zoom in on eCash blinding
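
A minimal Python sketch of the RSA blinding behind the eCash withdrawal shown above: the bank signs a blinded value, and the withdrawer unblinds it into a valid signature on a coin the bank never saw. The toy key (p = 61, q = 53), the fixed blinding factor, and the serial-number hashing are illustrative assumptions, not production parameters; a real issuer would use full-size keys, proper padding, and a fresh random blinding factor per coin.

```python
# Toy sketch of Chaumian RSA blinding (illustrative parameters, not secure).
import hashlib
from math import gcd

# Toy bank RSA key (p = 61, q = 53): n = 3233, e = 17, d = 2753.
n, e, d = 3233, 17, 2753

# 1. Withdrawer picks a coin serial number and hashes it into Z_n.
m = int(hashlib.sha256(b"coin-serial-42").hexdigest(), 16) % n

# 2. Withdrawer blinds it with a factor r (fixed here for clarity; random in practice).
r = 123
assert gcd(r, n) == 1
blinded = (m * pow(r, e, n)) % n        # the bank sees only this value

# 3. Bank signs the blinded value with its private exponent d.
blind_sig = pow(blinded, d, n)

# 4. Withdrawer unblinds: dividing out r leaves the bank's signature on m.
sig = (blind_sig * pow(r, -1, n)) % n

# 5. Anyone can verify the coin against the bank's public key (n, e),
#    yet the bank cannot link the coin to the withdrawal.
assert pow(sig, e, n) == m
print("coin", m, "carries valid bank signature", sig)
```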

  31. Privacy and Control over Payments • Nobody can learn, without your cooperation, who you pay, how much you pay, or when • You can always prove who received any payment, for how much, and when • Payments can only be made by you and they cannot be stopped by others

  32. Credential Mechanisms • You deal with each organization under a distinct “digital pseudonym”—a public key whose corresponding private key only you know • You obtain a “credential” as a digital signature formed on one of your digital pseudonyms • You answer the queries you choose to by proving that you have sufficient credentials
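
A minimal sketch of the pseudonym-and-credential idea, assuming the third-party Python `cryptography` package is available. The issuer role, the organization names, and the "over-18" attribute are illustrative assumptions; a plain Ed25519 signature stands in for Chaum's actual credential mechanism, which additionally lets the holder transform a credential for showing under a different pseudonym.

```python
# Sketch: distinct pseudonym (key pair) per organization; a credential is a
# signature by an issuer on one of the pseudonyms. Names are hypothetical.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# The individual holds a distinct key pair ("digital pseudonym") per organization.
pseudonym_sk = {org: ed25519.Ed25519PrivateKey.generate() for org in ("bank", "pharmacy")}
pseudonym_pk = {org: sk.public_key() for org, sk in pseudonym_sk.items()}

# An issuer grants a credential: a digital signature formed on one of the pseudonyms.
issuer_sk = ed25519.Ed25519PrivateKey.generate()
issuer_pk = issuer_sk.public_key()
bank_pseudonym = pseudonym_pk["bank"].public_bytes(
    encoding=serialization.Encoding.Raw, format=serialization.PublicFormat.Raw)
credential = issuer_sk.sign(b"over-18:" + bank_pseudonym)

# The organization checks the credential against the issuer's public key.
# It learns the attribute, but sees only the pseudonym used with it.
issuer_pk.verify(credential, b"over-18:" + bank_pseudonym)  # raises InvalidSignature if forged
print("credential accepted for the pseudonym shown to the bank")
```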

  33. Wallet with Observer • A tamper-resistant chip, issued by a trusted authority, is carried by the individual • But the chip can only talk to the outside world through the person’s PC/PDA • The two devices perform a multiparty computation and thus speak to the outside world with a common voice
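
A toy Python illustration of the wallet-with-observer architecture described above, not the real multiparty signature protocol: the tamper-resistant chip can reach the outside world only through the user's own device, which forwards the chip's contribution only for transactions the user approved. The HMAC-based "co-authorization" and all class and key names are assumptions made for the sketch.

```python
# Toy wallet-with-observer: the chip enforces issuer rules (e.g. no double-
# spending), the wallet enforces user consent, and neither can act alone.
import hashlib
import hmac

class ObserverChip:
    """Issuer-trusted chip: holds its own secret and refuses double-spending."""
    def __init__(self, secret: bytes):
        self._secret = secret
        self._spent = set()

    def authorize(self, transaction: bytes) -> bytes:
        if transaction in self._spent:
            raise ValueError("chip refuses to double-spend")
        self._spent.add(transaction)
        return hmac.new(self._secret, transaction, hashlib.sha256).digest()

class Wallet:
    """User-trusted device: the chip's only channel to the outside world."""
    def __init__(self, chip: ObserverChip, secret: bytes):
        self._chip = chip
        self._secret = secret

    def pay(self, transaction: bytes, user_approved: bool) -> tuple[bytes, bytes]:
        if not user_approved:
            raise ValueError("wallet blocks anything the user did not approve")
        chip_tag = self._chip.authorize(transaction)      # chip's contribution
        wallet_tag = hmac.new(self._secret, transaction, hashlib.sha256).digest()
        # Both contributions together form the payment authorization,
        # so the two devices speak with a common voice.
        return chip_tag, wallet_tag

wallet = Wallet(ObserverChip(b"issuer-installed-secret"), b"user-secret")
print(wallet.pay(b"pay merchant 10", user_approved=True))
```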

  34. How untraceable sending works: a mix network [diagram: messages 1–4 enter the mix; the “mix” server decrypts and re-orders its inputs]

  35. Prevents tracing messages back [diagram: an observer cannot link output message 2 back to its input]

  36. Cascade of three mixes [diagram: Server 1 (PK1), Server 2 (PK2), and Server 3 (PK3) connected in series]

  37. Encryption of a message for the cascade: Ciphertext = E_PK1[E_PK2[E_PK3[message]]]

  38. Processing the messages [diagram: Server 1, Server 2, and Server 3 each decrypt their layer and permute the batch of messages m1, m2, m3 before passing it on]
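
A minimal, self-contained Python sketch of the three-mix cascade from slides 36–38. A keyed XOR stream stands in for the public-key layers E_PK1[E_PK2[E_PK3[message]]], so the example runs without any cryptographic library; a real mix would use actual public-key encryption. Each server strips its own layer and then permutes the batch so that input order cannot be matched to output order.

```python
# Toy 3-mix cascade: layered "encryption" (keyed XOR stand-in), then each
# server decrypts its layer and shuffles the batch before forwarding.
import hashlib
import random

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def layer(key: bytes, data: bytes) -> bytes:
    # XOR with a key-derived stream; applying it a second time removes the layer.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

server_keys = [b"server-1", b"server-2", b"server-3"]   # stand-ins for PK1, PK2, PK3

def wrap(message: bytes) -> bytes:
    # The sender adds layers inside-out: PK3 first, then PK2, then PK1 outermost.
    for key in reversed(server_keys):
        message = layer(key, message)
    return message

# A batch of messages from different senders.
batch = [wrap(m) for m in [b"message 1", b"message 2", b"message 3", b"message 4"]]

# Each server strips its own layer, then permutes the batch so that
# input order cannot be matched to output order.
for key in server_keys:
    batch = [layer(key, item) for item in batch]
    random.shuffle(batch)

print(batch)   # the original plaintexts, in an unlinkable order
```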

  39. Tracing prevented by any mix [diagram: as long as any one of the three servers keeps its permutation secret, output m3 cannot be traced back to its sender]

  40. IAO: NSDD 145 & Data Mining. The Information Awareness Office (IAO) develops and demonstrates information technologies and systems to counter asymmetric threats by achieving total information awareness useful for preemption, national security warning, and national security decision-making. John Poindexter, national security adviser to former President Reagan, is the director of the new agency. He was a controversial figure both for his role in the Iran-Contra scandal and for his efforts to assert military influence over commercial computer security technologies.
