
Cryptography on Non-Trusted Machines


Presentation Transcript


  1. www.dziembowski.net/Slides Cryptography on Non-Trusted Machines. Stefan Dziembowski. International Workshop on DYnamic Networks: Algorithms and Security, September 5, 2009, Wroclaw, Poland.

  2. Idea: Design cryptographic protocols that are secure even on machines that are not fully trusted.

  3. How to construct secure digital systems? The CRYPTO layer is very secure – its security is based on well-defined mathematical problems – but its implementation on the MACHINE (PC, smartcard, etc.) is not secure!

  4. The problem: the CRYPTO is hard to attack, but its implementation on the MACHINE (PC, smartcard, etc.) is easy to attack.

  5. Machines (PC, smartcard, etc.) cannot be trusted! 1. Information leakage, 2. Malicious modifications.

  6. Relevant scenarios: PCs – malicious software (viruses, trojan horses); specialized hardware – side-channel attacks.

  7. Examples of side-channel attacks: • timing attack – measuring how much time various computations take to perform, • power monitoring attack – measuring the power consumption of the hardware during computation, • attacks based on leaked electromagnetic radiation, • acoustic cryptanalysis – exploiting sound produced during a computation, • differential fault analysis – introducing faults into a computation.

  8. Type of information that can be learnt: • individual bits (probing attacks), • more general functions (e.g. in the Hamming attack the adversary learns the sum of the secret bits). More on the practical attacks: Side Channel Cryptanalysis Lounge.

  9. The standard view: practitioners take care of the MACHINE (PC, smartcard, etc.) with anti-virus software, intrusion detection, tamper resistance, …, while theoreticians take care of the CRYPTO with definitions, theorems, security reductions, … – "Implementation is not our business!"

  10. Our model: besides the (standard) black-box access to the cryptographic scheme, the adversary gets additional access to its internal data.

  11. Plan • Private Circuits • Bounded-Retrieval Model • Entity authentication • Intrusion-Resilient Secret Sharing • Leakage-Resilient Stream Cipher • Open Problems

  12. Private Circuits This part of the lecture is based on [Ishai, Sahai, Wagner: Private Circuits: Securing Hardware against Probing Attacks. CRYPTO 2003] Motivation: Cryptographic hardware can be subject to “probing attacks”.

  13. Probing attacks: the adversary can insert needles into the device and read off the internal values. We will model the device as a Boolean circuit.

  14. Randomized Boolean circuits: input gates (a0, …, a7), output gates (b1, …, b5), conjunction ("and") gates, negation ("neg") gates, and random-bit ("rnd") gates, connected by wires. The size of a circuit is its number of gates; it also has a depth.
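To make this model concrete, here is a minimal sketch (not from the slides) of such a circuit in Python; the representation and gate names are assumptions chosen for illustration. The list of all wire values is returned, because those wires are exactly what a probing adversary may read.

```python
import secrets

def eval_circuit(gates, inputs):
    """Evaluate a randomized Boolean circuit.

    `inputs` are the values of the input gates; every entry of `gates`
    is a pair (op, args) with op in {"and", "neg", "rnd"} and args a
    tuple of indices of earlier wires.  The returned list holds the
    value of every wire -- a t-limited adversary reads up to t of them.
    """
    wires = list(inputs)                      # wires of the input gates
    for op, args in gates:
        if op == "and":                       # conjunction gate
            wires.append(wires[args[0]] & wires[args[1]])
        elif op == "neg":                     # negation gate
            wires.append(1 - wires[args[0]])
        elif op == "rnd":                     # random-bit gate
            wires.append(secrets.randbits(1))
    return wires

# Tiny example: wires 0,1 are inputs, wire 2 = a0 AND a1, wire 3 = NOT wire 2.
print(eval_circuit([("and", (0, 1)), ("neg", (2,))], [1, 1]))   # [1, 1, 1, 0]
```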

  15. A t-limited adversary. Assumption: the adversary can read off up to t wires of the circuit; she does not need to be computationally bounded.

  16. An idea: a transformation T mapping a circuit C (for simplicity assume it is deterministic) to a circuit C' = T(C). C and C' should compute the same function, and T(C) should be as secure as C even if the adversary can read off t wires.

  17. Problem: we want to require that "no adversary can get any information about the input a" of the circuit C. But the adversary can always read a directly.

  18. Solution: wrap the circuit C between an input encoder I (applied to the input a) and an output decoder O (producing the output b). The adversary cannot read the wires inside I and O, and I and O should not depend on C.

  19. The model: the adversary reads off some t wires of C (running on input a) and then outputs some value x.

  20. The security definition: for every C' and every input a, and for every adversary that attacks C' (run on the encoded input I(a)), there exists a simulator that has no access to the execution of C' and whose output x has the same distribution as the adversary's.
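In symbols, one way to write down the definition sketched on the slide (the notation $\mathrm{view}_t$ for the $t$ wire values read by the adversary is my own, not from the slides):

$$\forall\, C' = T(C),\ \forall\, a,\ \forall\, \mathcal{A}\ \ \exists\, \mathcal{S}:\qquad \mathcal{A}\big(\mathrm{view}_t(C'(I(a)))\big)\ \stackrel{d}{=}\ \mathcal{S}(),$$

where $\stackrel{d}{=}$ denotes equality of distributions and the simulator $\mathcal{S}$ runs without access to the execution of $C'$ on $I(a)$.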

  21. The construction: we are now going to construct (T, I, O). We first present the main idea (which contains some errors) and then repair it. Main tool: secret sharing.

  22. m-out-of-n secret sharing: the dealer's secret S is split into shares S1, S2, S3, S4, S5 (here n = 5). • Every set of at least m players can reconstruct S. • Any set of fewer than m players has no information about S.

  23. Secret sharing – more generally: every secret sharing protocol consists of • a sharing procedure, • a matching reconstruction procedure, and • a security condition.

  24. n-out-of-n secret sharing. This lecture: n-out-of-n secret sharing. Example: suppose S ∈ {0,1}. The dealer selects uniformly at random S1, ..., Sn ∈ {0,1} such that S1 + ... + Sn = S mod 2.
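A minimal sketch of this XOR-based scheme in Python (the function names are my own):

```python
import secrets

def share(secret_bit, n):
    """n-out-of-n sharing of a bit: pick n-1 random bits and fix the
    last one so that the XOR of all shares equals the secret."""
    shares = [secrets.randbits(1) for _ in range(n - 1)]
    shares.append(secret_bit ^ _xor(shares))
    return shares

def reconstruct(shares):
    """All n shares together give back the secret; any n-1 of them
    are uniformly random and carry no information about it."""
    return _xor(shares)

def _xor(bits):
    result = 0
    for b in bits:
        result ^= b
    return result

assert reconstruct(share(1, 5)) == 1
```

This is also what the input encoder I of the construction does to each input bit (next slide), with n = m.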

  25. Idea: encode every bit of the input using m-out-of-m secret sharing for m = t + 1. Example (t = 2): the input encoder I maps the input bits a, b, c to shares a1, a2, a3 random such that a1 + a2 + a3 = a mod 2, b1, b2, b3 random such that b1 + b2 + b3 = b mod 2, and c1, c2, c3 random such that c1 + c2 + c3 = c mod 2. Decoding is trivial.

  26. The transformation T replaces each gate of the original circuit (here a small circuit of "and" and "neg" gates on inputs a, b, c) by a gadget that operates on the shares.

  27. How to handle negation? Just negate the first share... (example: t = 4, computing "not a" from the shares of a).
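A minimal sketch of this gadget on the XOR-shares used above (same list-of-bits encoding as in the sharing sketch; the function name is mine):

```python
def neg_gadget(a_shares):
    """Negation gadget: flipping the first share flips the XOR of all
    shares, i.e. the encoded bit; no other wire changes."""
    return [1 - a_shares[0]] + list(a_shares[1:])
```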

  28. How to handle multiplication? (An "and" gate computing c = a · b from the shares of a and b.)

  29. How to handle multiplication? Observation: a · b = (a1 + ... + am) · (b1 + ... + bm) = the sum of all products ai · bj mod 2, so the products of the shares determine the product of the secrets.

  30. An idea: build the shares ci of c directly from the sharing of a and the sharing of b. Problem: if the adversary can see that ci = 1 then she knows that b = 1. Idea: add randomization...

  31. An improved idea: randomly flip some entries. We do it symmetrically.

  32. (Diagram: the entries are XOR-ed with random bits.)

  33. Observation: (a1, a2, a3) and (b1, b2, b3) may not be "independent". Example: an "and" gate whose both inputs are a (so the two sharings are identical).

  34. Example (t = 2): suppose that the adversary can observe that a3·a1 = 1 and a3·a2 = 1. Then she knows that a1 = a2 = a3 = 1, so she knows that a1 + a2 + a3 = 1 mod 2. What is the reason? Some wires give information about two ai's.

  35. A solution Set m := 2t + 1. In other words: Instead of (t+1)-out-of-(t+1) secret sharing use (2t+1)-out-of-(2t+1) secret sharing

  36. Example: t = 2, m = 5 (the "and" gadget on 5 shares, with the symmetric XOR randomization).
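Putting the pieces together, here is a sketch in the spirit of the [Ishai-Sahai-Wagner] multiplication gadget described on the preceding slides (written in Python; variable names are mine). Each product ai·bj is masked by a fresh random bit placed symmetrically, so XOR-ing the output shares still gives a·b, and with m = 2t + 1 shares any t probed wires are supposed to reveal nothing.

```python
import secrets

def and_gadget(a_shares, b_shares):
    """'and' gadget on XOR-shares: returns shares of (a AND b).

    For i < j a fresh random bit r[i][j] is drawn and r[j][i] is set to
    (r[i][j] ^ a_i*b_j) ^ a_j*b_i, so r[i][j] ^ r[j][i] = a_i*b_j ^ a_j*b_i.
    Hence XOR-ing all output shares gives XOR over i,j of a_i*b_j = a*b.
    """
    m = len(a_shares)
    r = [[0] * m for _ in range(m)]
    for i in range(m):
        for j in range(i + 1, m):
            r[i][j] = secrets.randbits(1)
            r[j][i] = (r[i][j] ^ (a_shares[i] & b_shares[j])) ^ (a_shares[j] & b_shares[i])
    c_shares = []
    for i in range(m):
        c_i = a_shares[i] & b_shares[i]
        for j in range(m):
            if j != i:
                c_i ^= r[i][j]            # mask with the symmetric randomness
        c_shares.append(c_i)
    return c_shares

# Demo: shares of a = 1 and b = 1 (any fixed shares XOR-ing to 1 will do).
a_sh, b_sh = [1, 0, 0, 1, 1], [0, 1, 1, 0, 1]
c_sh = and_gadget(a_sh, b_sh)
assert (c_sh[0] ^ c_sh[1] ^ c_sh[2] ^ c_sh[3] ^ c_sh[4]) == 1   # = a AND b
```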

  37. The blow-up: the size of the circuit is increased by a factor of O(t²); the depth of the circuit is increased by a factor of O(log t).

  38. A subsequent paper: Y. Ishai, M. Prabhakaran, A. Sahai, and D. Wagner. Private Circuits II: Keeping Secrets in Tamperable Circuits. EUROCRYPT 2006. They consider active attacks, i.e. the adversary can modify the circuit.

  39. Plan • Private Circuits • Bounded-Retrieval Model • Entity authentication • Intrusion-Resilient Secret Sharing • Leakage-Resilient Stream Cipher • Open Problems

  40. Bounded-Retrieval Model This part of the lecture is based on [D. Intrusion-Resilience via the Bounded-Storage Model. TCC 2006] Motivation: PCs can be attacked by viruses

  41. The problem: computers can be infected by malware (the adversary installs a virus that retrieves some data)! The virus can: • take control over the machine, • steal some secrets stored on the machine. Can we run any crypto on such machines?

  42. Is there any remedy? If the virus can download all the data stored on the machine then the situation looks hopeless (because he can “clone” the machine). Idea: Assume that he cannot do it!

  43. Bounded-Retrieval Model: make secrets so large that the adversary cannot retrieve them completely (500 GB ≈ $200). Practicality?

  44. The general model: time is divided into periods; in some of them the adversary installs a virus and retrieves some data, in others there is no virus on the machine. The total amount of retrieved data is bounded!

  45. Our goal Try to preserve as much security as possible (assuming the scenario from the previous slide). Of course as long as the virus is controlling the machine nothing can be done. Therefore we care about the periods when the machine is free of viruses.

  46. Two variants. How does the virus decide what to retrieve? Variant 1 [D06a,D06b,CDDLLW07,DP07,DP08]: he can compute whatever he wants on the victim's machine. Variant 2 [CLW06,…]: he can only access some individual bits on the victim's machine ("slow memory") (a bit similar to the "private circuits").

  47. Can we implement anything in this model? Yes! E.g. entity authentication: we solve the following problem – how can the bank verify the authenticity of the user?

  48. Entity authentication – the solution: the key is a huge random bit string R = (R1,…,Rt), e.g. 00011010011101001001101011100111011111101001110101010101001001010011110000100111111110001010. The bank sends a challenge Y = {y1,…,ym} – a random set of indices in R – and the user replies with f(R,Y), a value computed from the selected bits (Ry1,…,Rym), which the bank verifies.
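A minimal sketch of this protocol in Python (the slide leaves the exact definition of f abbreviated; here the response is simply the list of selected key bits, and all names are hypothetical):

```python
import random, secrets

_rng = random.SystemRandom()          # OS randomness for the challenge

def keygen(t):
    """The shared key R = (R1, ..., Rt): a huge random bit string,
    stored by both the user and the bank."""
    return [secrets.randbits(1) for _ in range(t)]

def challenge(t, m):
    """Bank: pick Y = {y1, ..., ym}, a random set of m indices into R."""
    return _rng.sample(range(t), m)

def respond(R, Y):
    """User: f(R, Y) -- here simply the key bits at the challenged positions."""
    return [R[y] for y in Y]

def verify(R, Y, answer):
    """Bank: recompute f(R, Y) from its own copy of R and compare."""
    return answer == [R[y] for y in Y]

R = keygen(10_000)                    # in the Bounded-Retrieval Model, R would be gigabytes
Y = challenge(10_000, 64)
assert verify(R, Y, respond(R, Y))
```

An adversary who has retrieved only a constant fraction of R is unlikely to know all m challenged bits, which is what the theorem on the next slide formalizes.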

  49. Security of the authentication protocol. Theorem [D06a]: an adversary that "retrieved" a constant fraction of R is not able to impersonate the user. (This of course holds in the periods when the virus is not on the machine.)

  50. What needs to be proven? Essentially: for a uniformly random R and Y, and for any function h that is (sufficiently) "shrinking its input", given h(R) and Y, with an overwhelming probability the value Z = f(R, Y) is hard to guess.
