Presentation Transcript

  1. Cryptography on non-trusted machines Stefan Dziembowski, University of Rome La Sapienza

  2. General plan • December 2008: Introduction to provably-secure symmetric cryptography • January 2009: Cryptography on non-trusted machines

  3. Introduction to provably-secure symmetric cryptography Stefan Dziembowski, University of Rome La Sapienza

  4. Cryptography In the past: the art of encrypting messages (mostly for military applications). Now: the science of securing digital communication and transactions (encryption, authentication, digital signatures, e-cash, auctions, etc.).

  5. Plan • Encryption schemes • information-theoretically secure • computationally-secure; basic tools: • pseudorandom generators • one-way functions • pseudorandom functions and permutations • Message authentication schemes • information-theoretically secure • computationally-secure • Hash functions (collision-resistance and the random oracle model). (We'll talk only about symmetric-key cryptography.)

  6. Encryption schemes (a very general picture) Encryption scheme (cipher) = encryption & decryption. The sender encrypts a plaintext m into a ciphertext c; the receiver decrypts c back to m. In the past the plaintext was a text in natural language; now it is a string of bits. The eavesdropper should not learn m.

  7. Art vs. science In the past: lack of precise definitions, ad-hoc design, usually insecure. Nowadays: formal definitions, systematic design, very secure constructions.

  8. Provable security We want to construct schemes that are provably secure. But... • why do we want to do it? • how to define it? • and is it possible to achieve it?

  9. Provable security – the motivation In many areas of computer science formal proofs are not essential. For example, instead of proving that an algorithm is efficient, we can just simulate it on a “typical input”. In cryptography this is not possible, because there cannot exist an experimental proof that a scheme is secure. Why? Because a notion of a “typical adversary” does not make sense.

  10. Provable security – a tradeoff On one extreme: strong proofs, but not efficient. On the other: no proofs, but very efficient. Good news: there exist schemes that • are very efficient • have some provable properties.

  11. A typical strategy for a theoretician Take a scheme that is used in real life, and then: • prove that it is secure, or • propose an equally efficient scheme that is provably secure. This works especially well if the real-life schemes have some security weaknesses. Examples of success stories: HMAC, OAEP...

  12. Kerckhoffs' principle Auguste Kerckhoffs (1883): “The enemy knows the system.” The cipher should remain secure even if the adversary knows the specification of the cipher. The only thing that is secret is a short key k, which is usually chosen uniformly at random.

  13. A more refined picture Alice encrypts the plaintext m under a key k to obtain the ciphertext c; Bob decrypts c with the same key k to recover m. Let us assume that k is uniformly random. (Of course Bob can use the same method to send messages to Alice. That's why it's called the symmetric setting.)

  14. Kerckhoffs' principle – the motivation • In commercial products it is unrealistic to assume that the design details remain secret (reverse-engineering!) • Short keys are easier to protect, generate and replace. • The design details can be discussed and analyzed in public. Not respecting this principle = “security by obscurity”.

  15. A mathematical view K – key space, M – plaintext space, C – ciphertext space. An encryption scheme is a pair (Enc, Dec), where • Enc : K × M → C is an encryption algorithm, • Dec : K × C → M is a decryption algorithm. We will sometimes write Enck(m) and Deck(c) instead of Enc(k,m) and Dec(k,c). Correctness: for every k and m we should have Deck(Enck(m)) = m.

  16. Historical ciphers • shift cipher, • substitution cipher, • Vigenère cipher, • Enigma, • ... all broken...

  17. How to define security? Defining “security of an encryption scheme” is not trivial. Consider the following experiment (m – a message): • the key K is chosen randomly • C := EncK(m) is given to the adversary. (K doesn't need to be uniform, but at least has to be “fresh”, i.e. sampled independently of m.) How should security be defined?

  18. Idea 1 (m – a message) • the key K is chosen randomly • C := EncK(m) is given to the adversary An idea: “The adversary should not be able to compute K.” A problem: the encryption scheme that “doesn't encrypt”, EncK(m) = m, satisfies this definition!

  19. Idea 2 (m – a message) • the key K is chosen randomly • C := EncK(m) is given to the adversary An idea: “The adversary should not be able to compute m.” A problem: what if the adversary can compute, e.g., the first half of m?

  20. Idea 3 (m – a message) • the key K is chosen randomly • C := EncK(m) is given to the adversary An idea: “The adversary should not learn any information about m.” A problem: but he may already have some a priori information about m! For example, he may know that m is a sentence in English...

  21. Idea 4 (m – a message) • the key K is chosen randomly • C := EncK(m) is given to the adversary An idea: “The adversary should not learn any additional information about m.” This makes much more sense. But how to formalize it?

  22. Notation We will use the language of probability theory. A – a random variable, X – an event. Then PA denotes the distribution of A: PA(a) = P(A = a), and PA|X denotes the distribution of A conditioned on X: PA|X(a) = P(A = a | X).

  23. More notation Two (discrete) random variables A and B are independent if for every a and b: P(A = a and B = b) = P(A = a) P(B = b).

  24. How to formalize “Idea 4”? “The adversary should not learn any additional information about m.” An encryption scheme is perfectly secret (also called: information-theoretically secret) if for every random variable M and every c Є C such that P(C = c) > 0 we have PM = PM | Enc(K,M) = c. Equivalently: M and Enc(K,M) are independent.

  25. Equivalently: • for every M: M and Enc(K,M) are independent • for every M and m: PEnc(K,M) = PEnc(K,M) | M = m • for every M and m: PEnc(K,M) = PEnc(K,m) • for every m0 and m1: PEnc(K,m0) = PEnc(K,m1) (the last condition is the most intuitive one).

  26. A perfectly secret scheme: one-time pad (Vernam's cipher, after Gilbert Vernam, 1890–1960). t – a parameter, K = M = C = {0,1}^t. Enck(m) = k xor m (component-wise xor), Deck(c) = k xor c. Correctness is trivial: Deck(Enck(m)) = k xor (k xor m) = m.
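The scheme above is easy to run. Here is a minimal Python sketch (my own illustration, not part of the original slides), with byte strings playing the role of the bit-strings in {0,1}^t:

```python
import secrets

def keygen(t: int) -> bytes:
    """Sample a uniformly random t-byte key."""
    return secrets.token_bytes(t)

def enc(k: bytes, m: bytes) -> bytes:
    """Enc_k(m) = k xor m (component-wise)."""
    assert len(k) == len(m), "the key must be as long as the message"
    return bytes(ki ^ mi for ki, mi in zip(k, m))

def dec(k: bytes, c: bytes) -> bytes:
    """Dec_k(c) = k xor c; xor is its own inverse, so Dec_k(Enc_k(m)) = m."""
    return enc(k, c)

m = b"attack at dawn"
k = keygen(len(m))
c = enc(k, m)
assert dec(k, c) == m
```

Note that `dec` simply reuses `enc`: in the xor group, encryption and decryption are the same operation.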

  27. Perfect secrecy of the one-time pad Perfect secrecy of the one-time pad is also trivial. This is because for every m the distribution PEnc(K,m) is uniform (and hence does not depend on m).
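For a small parameter t this claim can be verified exhaustively: for every fixed m, as k ranges over all of {0,1}^t, the ciphertext k xor m takes every value in {0,1}^t exactly once. A brute-force check (my own illustration, with t = 3 and bit-strings encoded as integers 0..7):

```python
from collections import Counter

t = 3
keys = range(2 ** t)

for m in range(2 ** t):
    # Distribution of Enc(K, m) = K xor m under a uniform key K.
    counts = Counter(k ^ m for k in keys)
    # Uniform: every ciphertext value arises from exactly one key.
    assert all(counts[c] == 1 for c in range(2 ** t))

print("P_Enc(K,m) is uniform for every m, hence independent of m")
```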

  28. Observation The one-time pad can be generalized as follows. Let (G,+) be a group and let K = M = C = G. The following is a perfectly secret encryption scheme: • Enc(k,m) = m + k • Dec(k,c) = c – k.
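For instance, taking G = (Z_26)^t, i.e. addition mod 26 applied position-wise with a fresh uniform shift for each letter, gives a perfectly secret "one-time shift cipher" over the alphabet a–z. A sketch (my own example, not from the slides):

```python
import secrets

def shift_keygen(t):
    """A key is a sequence of t independent uniform elements of Z_26."""
    return [secrets.randbelow(26) for _ in range(t)]

def shift_enc(k, m):
    """Enc(k, m) = m + k in (Z_26)^t, applied letter-wise to a lowercase string."""
    return "".join(chr((ord(mi) - ord('a') + ki) % 26 + ord('a'))
                   for mi, ki in zip(m, k))

def shift_dec(k, c):
    """Dec(k, c) = c - k, the group inverse of encryption."""
    return "".join(chr((ord(ci) - ord('a') - ki) % 26 + ord('a'))
                   for ci, ki in zip(c, k))

m = "attackatdawn"
k = shift_keygen(len(m))
assert shift_dec(k, shift_enc(k, m)) == m
```

With a key as long as the message this is perfectly secret; with a short key repeated periodically it degenerates into the Vigenère cipher from the "historical ciphers" slide, which is broken.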

  29. Why is the one-time pad not practical? • The key has to be as long as the message. • The key cannot be reused. This is because xor-ing two ciphertexts produced with the same key reveals the xor of the plaintexts: (k xor m1) xor (k xor m2) = m1 xor m2.

  30. The one-time pad is optimal in the class of perfectly secret schemes Theorem (Shannon 1949): In every perfectly secret encryption scheme Enc : K × M → C, Dec : K × C → M we have |K| ≥ |M|. Proof: Wlog suppose that C contains only the ciphertexts that have non-zero probability. Fact: we always have |C| ≥ |M|, because for every k the map Enck : M → C is an injection (otherwise we wouldn't be able to decrypt). Perfect secrecy implies that the distribution of Enc(K,m) does not depend on m, so for any fixed m every c Є C equals Enc(k,m) for some key k; hence |K| ≥ |C|. Combining: |K| ≥ |C| ≥ |M|.

  31. Outlook We constructed a perfectly secret encryption scheme. Our scheme has certain drawbacks (|K| ≥ |M|), but by Shannon's theorem this is unavoidable. So maybe the secrecy definition is too strong? Can we go home and relax?

  32. What to do? Idea: limit the power of the adversary. How? Classical (computationally-secure) cryptography: bound his computational power. Alternative options: quantum cryptography, bounded-storage model, ... (not too practical).

  33. How to bound the computational power of the adversary? We required that M and EncK(M) are independent. It is enough to require that M and EncK(M) are independent “from the point of view of a computationally-limited adversary”. How can this be formalized? We will use complexity theory!

  34. Real cryptography starts here: Eve is computationally-bounded. But what does it mean? Ideas: • “She can use at most 1000 Intel Core 2 Extreme X6800 Dual Core Processors for at most 100 years...” • “She can buy equipment worth 1 million euro and use it for 30 years...” It's hard to reason formally about it.

  35. A better idea “The adversary has access to a Turing Machine that can make at most 10^30 steps.” We would need to specify exactly what we mean by a “Turing Machine”: • how many tapes does it have? • how does it access these tapes (maybe “random access memory” is a more realistic model)? • ... More generally, we could have definitions of the type: “a system X is (t,ε)-secure if every Turing Machine that operates in time t can break it with probability at most ε.” This would be quite precise, but the details of the machine model would matter. Moreover, this approach often leads to ugly formulas...

  36. What to do? Idea: • replace “t steps of a Turing Machine” with “efficient computation” • replace ε with “a value very close to zero”. How to formalize it? Use asymptotics!

  37. Efficiently computable? “efficiently computable” = “polynomial-time computable on a Turing Machine”, that is: running in time O(n^c) (for some constant c). Here we assume that Turing Machines are the right model for real-life computation. This is not true if a quantum computer is built...

  38. Very small? “very small” = “negligible” = approaches 0 faster than the inverse of any polynomial. Formally: a function µ : N → R is negligible if for every positive integer c there exists N such that for all n > N we have |µ(n)| < n^-c.
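Numerically, µ(n) = 2^-n is negligible: for any fixed exponent c, the product n^c · 2^-n still tends to 0, even though n^c on its own grows quickly. A small illustration (my own, using c = 10):

```python
# mu(n) = 2^-n multiplied by the polynomial n^10 still vanishes
# as n grows (the exponential decay dominates any polynomial):
for n in (10, 100, 200):
    print(n, n ** 10 * 2.0 ** -n)
```

For these sample points the products shrink from millions down past 10^-37, previewing the "negl * poly = negl" rule on the next slide.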

  39. Nice properties of these notions • A sum of two polynomials is a polynomial: poly + poly = poly • A product of two polynomials is a polynomial: poly * poly = poly • A sum of two negligible functions is a negligible function: negl + negl = negl Moreover: • A negligible function multiplied by a polynomial is negligible negl * poly = negl

  40. Security parameter The terms “negligible” and “polynomial” make sense only if the scheme X (and the adversary) take an additional input 1^n called the security parameter. In other words: we consider an infinite sequence X(1), X(2), ... of schemes. Typically, we will say that a scheme X is secure if for every polynomial-time Turing Machine M: P(M breaks the scheme X) is negligible.

  41. Convention (in this lecture) Security parameter n = the length of the secret key k; in other words, k is always a random element of {0,1}^n. Observation: the adversary can always guess k with probability 2^-n. This probability is negligible.
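For a toy key length the observation can be checked exactly (my own illustration): a fixed guess of a uniform n-bit key is correct with probability exactly 2^-n.

```python
from fractions import Fraction

n = 8
guess = 0                                   # the adversary's fixed guess
# Count, over all 2^n equally likely keys, how often the guess is right.
p = Fraction(sum(1 for k in range(2 ** n) if k == guess), 2 ** n)
assert p == Fraction(1, 2 ** n)             # success probability is exactly 2^-n
print(p)  # 1/256
```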

  42. Is this the right approach? Advantages • All types of Turing Machines are “equivalent” up to a “polynomial reduction”. Therefore we do not need to specify the details of the model. • The formulas get much simpler. Disadvantage Asymptotic results don't tell us anything about the security of concrete systems. However Usually one can prove formally an asymptotic result and then argue informally that “the constants are reasonable” (and they can be calculated if one really wants).

  43. How to change the security definition? Recall: an encryption scheme is perfectly secret if for every m0, m1 Є M: PEnc(K, m0) = PEnc(K, m1). We relax this in two ways: • m0, m1 are chosen by a poly-time adversary • we only require that no poly-time adversary can distinguish Enc(K, m0) from Enc(K, m1).

  44. A game ((Enc,Dec) – an encryption scheme) Both parties receive the security parameter 1^n. The adversary (a polynomial-time Turing machine) chooses m0, m1 such that |m0| = |m1| and sends them to the oracle. The oracle • selects k randomly from {0,1}^n • chooses a random bit b = 0,1 • calculates c := Enc(k, mb) and sends c to the adversary, who has to guess b. Security definition: We say that (Enc,Dec) is semantically secure if every polynomial-time adversary guesses b correctly with probability at most 0.5 + ε(n), where ε is negligible. Alternative name: has indistinguishable encryptions (sometimes we will say: “is computationally secure”, if the context is clear).
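The game can be played out in code. Below is a toy harness (my own sketch, not from the slides): against the identity "cipher" Enc(k,m) = m, the obvious adversary wins every time, while against the one-time pad with a fresh key no adversary can do noticeably better than 1/2.

```python
import secrets

def play_game(enc, adversary, n=16, trials=2000):
    """Fraction of games in which the adversary guesses the bit b correctly."""
    wins = 0
    for _ in range(trials):
        m0, m1 = b"\x00" * n, b"\xff" * n   # adversary's chosen pair, |m0| = |m1|
        k = secrets.token_bytes(n)          # oracle picks a fresh uniform key
        b = secrets.randbelow(2)            # oracle's secret bit
        c = enc(k, m0 if b == 0 else m1)
        wins += (adversary(m0, m1, c) == b)
    return wins / trials

identity_enc = lambda k, m: m                            # "doesn't encrypt"
otp_enc = lambda k, m: bytes(x ^ y for x, y in zip(k, m))

# Guess b = 0 exactly when the ciphertext equals m0.
guess = lambda m0, m1, c: 0 if c == m0 else 1

print(play_game(identity_enc, guess))  # 1.0: the scheme is trivially broken
print(play_game(otp_enc, guess))       # ~0.5: no advantage against the OTP
```

Against the one-time pad the event c == m0 requires k to hit one specific value, which happens with probability 2^-128 here, so the adversary's guess is essentially independent of b.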

  45. Testing the definition Suppose the adversary can compute k from Enc(k,m). Can he win the game? YES! Suppose the adversary can compute some bit of m from Enc(k,m). Can he win the game? YES!

  46. Multiple messages In real-life applications we need to encrypt multiple messages with one key. The adversary may learn something about the key by looking at the ciphertexts c1, ..., ct of some messages m1, ..., mt. How are these messages chosen? Let's say: the adversary can choose them!

  47. A chosen-plaintext attack (CPA) Both parties receive the security parameter 1^n. The oracle selects a random k Є {0,1}^n and chooses a random bit b = 0,1. The adversary chooses m'1 and receives c1 = Enc(k, m'1), ..., chooses m't and receives ct = Enc(k, m't). Challenge phase: the adversary chooses m0, m1 and receives c = Enc(k, mb). The interaction continues, and finally the adversary has to guess b.

  48. CPA-security Security definition: We say that (Enc,Dec) has indistinguishable encryptions under a chosen-plaintext attack (alternative name: is CPA-secure) if every randomized polynomial-time adversary guesses b correctly with probability at most 0.5 + ε(n), where ε is negligible. Observation: every CPA-secure encryption scheme has to • be randomized, or • “have a state”.
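The observation is easy to demonstrate: if Enc is deterministic and stateless, a CPA adversary can simply ask the oracle to encrypt m0 before the challenge and compare ciphertexts. A sketch (my own, using a fixed-key xor cipher, i.e. a reused one-time pad, as the deterministic scheme):

```python
import secrets

n = 16
k = secrets.token_bytes(n)                               # oracle's fixed key
det_enc = lambda m: bytes(x ^ y for x, y in zip(k, m))   # deterministic, stateless

m0, m1 = b"\x00" * n, b"\xff" * n
c0 = det_enc(m0)                   # chosen-plaintext query before the challenge

b = secrets.randbelow(2)           # oracle's secret bit
c = det_enc(m0 if b == 0 else m1)  # the challenge ciphertext

guess = 0 if c == c0 else 1        # equal ciphertexts reveal that mb = m0
assert guess == b                  # the adversary wins every time
```

Randomizing the encryption (or keeping a counter as state) makes repeated encryptions of the same plaintext look different, which is exactly what defeats this attack.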

  49. CPA in real life Q: Aren't we too pessimistic? A: No! A CPA can be mounted in practice. Example: routing. Alice and Bob share a key k and encrypt the traffic between them; an adversary who can inject messages m into the network so that they get encrypted, and then observe the resulting ciphertexts Enck(m), is exactly a chosen-plaintext attacker.

  50. Is it possible to prove security? Let (Enc,Dec) be an encryption scheme. For simplicity suppose that: • for a security parameter n the key is of length n • Enc is deterministic. Consider the language L = { (m, c) : there exists k such that c = Enc(k, m) }. Q: What if L is polynomial-time decidable? A: Then the scheme is broken (exercise). Is it really true? On the other hand: L is in NP (k is the NP-witness). So, if P = NP, then any semantically-secure encryption is broken.