
A Game-Theoretic Perspective on Oblivious Transfer

Kenji Yasunaga (ISIT). Joint work with Haruna Higo, Akihiro Yamada, and Keisuke Tanaka (Tokyo Inst. of Tech.). Crypto Seminar @ Kyushu University, May 8, 2012.




Presentation Transcript


  1. A Game-Theoretic Perspective on Oblivious Transfer Kenji Yasunaga (ISIT) Joint work with Haruna Higo, Akihiro Yamada, Keisuke Tanaka (Tokyo Inst. of Tech.) 2012.5.8 Crypto Seminar @ Kyushu University

  2. Cryptography and Game Theory • Cryptography: Design protocols in the presence of adversaries • Game theory: Study the behavior of rational players

  3. Cryptography and Game Theory • Cryptography: Design protocols in the presence of adversaries • Game theory: Study the behavior of rational players • Rational cryptography: Design cryptographic protocols for rational players • Rational Secret Sharing [HT04, GK06, ADG+06, KN08a, KN08b, MS09, OPRV09, FKN10, AL11]

  4. Asharov, Canetti, Hazay (Eurocrypt 2011) • Game-theoretically characterize properties of two-party protocols • Protocol π satisfies a “certain” property ⇔ A “certain” game defined by π has a “certain” solution concept with “certain” utility functions • Properties: Correctness, Privacy, Fairness • Adversary model: Fail-stop adversaries • Equivalent defs. for correctness and privacy • New def. for fairness

  5. This work • Game-theoretically characterize properties of “two-message” Oblivious Transfer (OT) • Advantages compared to [ACH11] • Game between two rational players • Essentially played by a single player in [ACH11] • Characterize correctness and privacy by a single game • Malicious adversaries

  6. Oblivious Transfer • A protocol between sender S and receiver R • Correctness: After running the protocol, R obtains xc and S obtains nothing (⊥) • Privacy • Privacy for S: R learns nothing about x1-c • Privacy for R: S learns nothing about c • [Diagram: S holds x0, x1 and R holds c ∈ {0,1}; after the OT protocol, R outputs xc and S outputs ⊥]
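
To fix the interface before the game-theoretic treatment, here is a minimal Python sketch of the ideal 1-out-of-2 OT functionality (a toy reference with no cryptography; the function name and types are illustrative and not from the slides):

def ideal_ot(x0, x1, c):
    """Ideal 1-out-of-2 OT: the sender contributes (x0, x1), the receiver
    contributes c; the receiver learns xc and the sender learns nothing."""
    receiver_output = x1 if c == 1 else x0   # R obtains xc
    sender_output = None                     # S obtains nothing (⊥)
    return sender_output, receiver_output

# Example: a receiver with c = 1 learns only x1.
assert ideal_ot(b"msg0", b"msg1", 1) == (None, b"msg1")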

  7. Why “two-message” OT? • [Diagram: two-message OT — one message in each direction (Msg1, Msg2) between S(x0, x1) and R(c ∈ {0,1}); R outputs xc, S outputs ⊥]

  8. Why “two-message” OT? • IND-based privacy fits the game-theoretic framework • Utility is high ⇔ Prediction is correct • There exists an IND-based privacy definition for two-message OT • [Diagram: two-message OT, as on the previous slide]

  9. Our results • Protocol π for two-message OT satisfies “correctness” and “privacy” ⇔ A “certain” game defined by π has a “certain” solution concept with “certain” utility functions

  10. Our results • Protocol π for two-message OT satisfies “correctness” and “privacy” ⇔ A “certain” game defined by π has a “certain” solution concept with “certain” utility functions • A game Gameπ defined by π has a Nash equilibrium with utility functions U = (US, UR)

  11. Cryptographic Correctness of OT • Protocol π = (S, R) • Correctness: ∀ x0, x1 ∈ {0,1}* s.t. |x0| = |x1|, ∀ c ∈ {0,1}: Pr[ outputR(S(x0, x1), R(c)) = xc ] ≥ 1 − negl
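
The quantitative content of this definition can be checked empirically. Below is a hedged Python sketch estimating Pr[ outputR = xc ] for a candidate OT implementation; run_protocol is a hypothetical interface returning the receiver's output, not something defined in the slides:

import random

def estimate_correctness(run_protocol, x0, x1, trials=10_000):
    """Monte Carlo estimate of Pr[ outputR = xc ] over a random choice bit.
    Correctness requires this to be >= 1 - negl for every x0, x1 and c."""
    ok = 0
    for _ in range(trials):
        c = random.randrange(2)            # receiver's choice bit
        out = run_protocol(x0, x1, c)      # hypothetical protocol runner
        ok += (out == (x1 if c == 1 else x0))
    return ok / trials

For the toy ideal_ot above, estimate_correctness(lambda x0, x1, c: ideal_ot(x0, x1, c)[1], b"msg0", b"msg1") returns 1.0.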

  12. Cryptographic Privacy of two-message OT • Privacy for R: ∀ PPT S* and ∀ x0, x1 ∈ {0,1}*, {viewS*(S*(x0, x1), R(0))} =c {viewS*(S*(x0, x1), R(1))} (=c denotes computational indistinguishability) • Privacy for S: ∃ a function Choice: {0,1}* → {0,1} s.t. ∀ deterministic poly-time R*, ∀ x0, x1, x, z ∈ {0,1}*, c ∈ {0,1}: {viewR*(S(X0), R*(c, z))} =c {viewR*(S(X1), R*(c, z))}, where X0 = (x0, x1), and X1 = (x0, x) if Choice(R*, c, z) = 0, X1 = (x, x1) otherwise • Choice indicates the choice bit of R*
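
Privacy for R is an indistinguishability statement about the sender's view. The following Python sketch measures a guesser's advantage in that experiment; run_with_sender and distinguisher are hypothetical stand-ins for S*'s execution and the distinguisher of the definition:

import random

def receiver_privacy_advantage(run_with_sender, distinguisher, x0, x1, trials=10_000):
    """Estimate | Pr[guess = c] - 1/2 | for a guesser that sees the sender's
    view of an execution with R(c).  Privacy for R says this advantage is
    negligible for every PPT sender strategy and distinguisher."""
    correct = 0
    for _ in range(trials):
        c = random.randrange(2)                 # R's secret choice bit
        view = run_with_sender(x0, x1, c)       # view of S* when R runs with c
        correct += (distinguisher(view) == c)
    return abs(correct / trials - 0.5)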

  13. Gameπ • Protocol: π = (Sπ, Rπ); Input: x0, x1, x, z ∈ {0,1}*, c ∈ {0,1}; Players: Sender (S, GS), Receiver (R, GR) • Gameπ((S, GS), (R, GR), Choice, x0, x1, x, c, zS, zR): • X0 = (x0, x1); X1 = (x0, x) if Choice(R, c, z) = 0, X1 = (x, x1) otherwise • b ←R {0, 1}, and set z to be empty if R = Rπ • Execute (S(Xb), R(c, z)) (→ outputR); set fin = 1 iff the protocol finished without abort • GS guesses c from viewS (→ guessS); GR guesses b from viewR (→ guessR) • Output (fin, outputR, guessS, guessR)
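
A hedged procedural reading of this definition (one run of the game) in Python. The execute runner, the strategy objects, and the is_honest flag are hypothetical interfaces used only to make the flow concrete, and the two auxiliary inputs are collapsed into a single z for brevity:

import random

def game_pi(execute, S, G_S, R, G_R, choice_fn, x0, x1, x, c, z):
    """One run of Game_pi, returning (fin, outputR, guessS, guessR)."""
    X0 = (x0, x1)                                         # the real input pair
    X1 = (x0, x) if choice_fn(R, c, z) == 0 else (x, x1)  # non-chosen slot replaced by x
    b = random.randrange(2)              # secret bit; the receiver side must guess it
    if R.is_honest:                      # the prescribed R_pi gets an empty auxiliary input
        z = b""
    # `execute` runs (S(Xb), R(c, z)) and reports fin (1 iff the protocol
    # finished without abort), both parties' views, and the receiver's output.
    fin, view_S, view_R, output_R = execute(S, (X0, X1)[b], R, c, z)
    guess_S = G_S(view_S)                # sender side guesses the choice bit c
    guess_R = G_R(view_R)                # receiver side guesses the hidden bit b
    return fin, output_R, guess_S, guess_R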

  14. Utility functions U = (US, UR) • US((S, GS), (R, GR)) = (−αS)・( Pr[ guessR = b ] − 1/2 ) + βS・( Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] − 1 ) + γS・( Pr[ guessS = c ] − 1/2 ) • αS, βS, γS are some positive constants • US is low if GR's guess is correct, or if the protocol finishes without abort but outputR is incorrect, or if GS's guess is incorrect • UR((S, GS), (R, GR)) = (−αR)・( Pr[ guessS = c ] − 1/2 ) + βR・( Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] − 1 ) + γR・( Pr[ guessR = b ] − 1/2 )
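
The utilities are linear functions of three event probabilities, so they can be written down directly (a sketch following the formulas on this slide; parameter names are illustrative):

def sender_utility(p_guessR_eq_b, p_fin_ok, p_guessS_eq_c, alpha_S, beta_S, gamma_S):
    """US: penalised when the receiver side guesses b, rewarded when the run
    completes correctly (or aborts) and when the sender side guesses c.
    p_fin_ok = Pr[ fin=0 or (fin=1 and outputR = xc) ]."""
    return (-alpha_S) * (p_guessR_eq_b - 0.5) \
           + beta_S * (p_fin_ok - 1.0) \
           + gamma_S * (p_guessS_eq_c - 0.5)

def receiver_utility(p_guessS_eq_c, p_fin_ok, p_guessR_eq_b, alpha_R, beta_R, gamma_R):
    """UR mirrors US with the roles of the two guesses swapped."""
    return (-alpha_R) * (p_guessS_eq_c - 0.5) \
           + beta_R * (p_fin_ok - 1.0) \
           + gamma_R * (p_guessR_eq_b - 0.5)

At the intended operating point (both guesses at chance, correct completion without abort) both utilities are 0.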

  15. Nash equilibrium • Protocol (S, R) is a Nash equilibrium for Gameπ ⇔ ∃ Choice s.t. ∀ PPT GS, GR, S*, (deterministic) R*, ∀ x0, x1, x, z ∈ {0,1}*, c ∈ {0,1}: US((S*, GS), (R, GR)) ≤ US((S, GS), (R, GR)) + negl and UR((S, GS), (R*, GR)) ≤ UR((S, GS), (R, GR)) + negl
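
As a small illustration of the quantifier structure (not the paper's formalism): the prescribed pair survives only if no unilateral deviation gains more than a negligible amount; negl below is just a numeric slack:

def is_nash_violation(u_S_dev, u_S_honest, u_R_dev, u_R_honest, negl=1e-9):
    """True if some unilateral deviation beats the prescribed strategy by more
    than the allowed negligible slack (for one fixed GS, GR and fixed inputs)."""
    sender_gains = u_S_dev > u_S_honest + negl    # US((S*,GS),(R,GR)) too large
    receiver_gains = u_R_dev > u_R_honest + negl  # UR((S,GS),(R*,GR)) too large
    return sender_gains or receiver_gains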

  16. Game-theoretic characterization • Main Theorem: Protocol π = (Sπ, Rπ) for two-message OT satisfies cryptographic correctness and privacy if and only if π = (Sπ, Rπ) is a Nash equilibrium for Gameπ with utility functions U = (US, UR)

  17. Proof (“Crypto ⇒ Game”) • Assume π is not game-theoretically secure, i.e., π = (Sπ, Rπ) is not a NE for Gameπ ⇒ ∀ Choice, ∃ GS, GR, S*, R*, x0, x1, x, z, c s.t. • Case 1: US((S*, GS), (Rπ, GR)) > US((Sπ, GS), (Rπ, GR)) + εS, or • Case 2: UR((Sπ, GS), (R*, GR)) > UR((Sπ, GS), (Rπ, GR)) + εR

  18. Proof (“Crypto ⇒ Game”) • Case 1: US((S*, GS), (Rπ, GR)) > US((Sπ, GS), (Rπ, GR)) + εS • Recall that US((Sπ, GS), (Rπ, GR)) = (−αS)・( Pr[ guessR = b ] − 1/2 ) + βS・( Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] − 1 ) + γS・( Pr[ guessS = c ] − 1/2 ) • When Sπ → S*: • Case 1-a: Pr[ guessR = b ] is lower • Case 1-b: Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] is higher • Case 1-c: Pr[ guessS = c ] is higher

  19. Proof (“Crypto ⇒ Game”) • Case 1-a: Pr[ guessR = b ] is lower ⇒ since Pr[ guessR = b ] ≤ 1/2 + negl when S*, (Rπ, GR) breaks the privacy for S • Case 1-b: Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] is higher ⇒ Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] < 1 − ε when Sπ ⇒ not cryptographically correct • Case 1-c: Pr[ guessS = c ] is higher ⇒ Pr[ guessS = c ] ≠ 1/2 ± negl when S* ⇒ (S*, GS) breaks the privacy for R

  20. Proof (“Game ⇒ Crypto”) • Assume π is not cryptographically secure ⇒ one of the following holds: • Case 1: Not cryptographically correct • Case 2: Cryptographically correct • Case 2-a: Not private for S when Rπ • Case 2-b: Private for S when Rπ, not private for R • Case 2-c: Private for R, not private for S when R*

  21. Proof (“Game ⇒ Crypto”) • Case 1: Not cryptographically correct ⇒ ∃ x0, x1, c s.t. Pr[ outputR = xc ] < 1 − ε1 ⇒ US((Sπ, GS), (Rπ, GR)) < −βS・ε1 while US((Sdef, GS), (Rπ, GR)) = 0 ⇒ US is higher when Sπ → Sdef • Sdef: Abort before start • Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] is higher when Sπ → Sdef
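
A hedged numeric sanity check of this case (illustrative constants; the guessers GS and GR are taken to guess uniformly at random, so the α and γ terms vanish on both sides):

alpha_S, beta_S, gamma_S = 1.0, 1.0, 1.0
eps1 = 0.1   # illustrative correctness gap: Pr[ outputR = xc ] < 1 - eps1

def U_S(p_guessR_eq_b, p_fin_ok, p_guessS_eq_c):
    return (-alpha_S) * (p_guessR_eq_b - 0.5) \
           + beta_S * (p_fin_ok - 1.0) \
           + gamma_S * (p_guessS_eq_c - 0.5)

# Honest S_pi never aborts, so p_fin_ok = Pr[ outputR = xc ] < 1 - eps1.
u_honest = U_S(0.5, 1.0 - eps1, 0.5)   # = -beta_S * eps1 < 0
# S_def aborts before the start: fin = 0 always, so p_fin_ok = 1, and the
# empty interaction gives both guessers nothing beyond chance.
u_abort = U_S(0.5, 1.0, 0.5)           # = 0
assert u_abort > u_honest              # aborting is a profitable deviation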

  22. Proof (“Game ⇒ Crypto”) • Case 2: Cryptographically correct • Case 2-a: Not private for S when Rπ ⇒ ∃ D1 who distinguishes viewRπ ⇒ US((Sπ, GS), (Rπ, GR)) < −αS・ε2 while US((Sstop, GS), (Rπ, GR)) = 0 (when GR uses D1) ⇒ US is higher when Sπ → Sstop • Sstop: Abort after receiving a message • Pr[ guessR = b ] is higher under Sπ than under Sstop

  23. Proof (“Game ⇒ Crypto”) • Case 2: Cryptographically correct • Case 2-b: Private for S when Rπ, not for R ⇒ ∃ S* and D2 who distinguishes viewS* ⇒ ∃ D2 who distinguishes viewSπ (since the OT is two-message) ⇒ UR((Sπ, GS), (Rπ, GR)) < −αR・ε3 while UR((Sπ, GS), (Rdef, GR)) = 0 (when GS uses D2) • Rdef: Abort before start • Pr[ guessS = c ] is higher under Rπ than under Rdef

  24. Proof (“Game ⇒ Crypto”) • Case 2: Cryptographically correct • Case 2-c: Private for R, not for S when R* ⇒ ∃ R* and D3 who distinguishes viewR* ⇒ UR((Sπ, GS), (Rπ, GR)) < negl while UR((Sπ, GS), (R*, GR)) = γR・ε4 (when GR uses D3) ⇒ UR is higher when Rπ → R* • Pr[ guessR = b ] is higher when Rπ → R*

  25. Notes • The main theorem holds even if γS = 0 or βR = 0 • US((S, GS), (R, GR)) = (−αS)・( Pr[ guessR = b ] − 1/2 ) + βS・( Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] − 1 ) + γS・( Pr[ guessS = c ] − 1/2 ) • UR((S, GS), (R, GR)) = (−αR)・( Pr[ guessS = c ] − 1/2 ) + βR・( Pr[ fin=0 ∨ (fin=1 ∧ outputR = xc) ] − 1 ) + γR・( Pr[ guessR = b ] − 1/2 )

  26. Conclusions (1/2) • Game-theoretically characterize “two-message” OT: Protocol π = (Sπ, Rπ) for two-message OT satisfies cryptographic correctness and privacy ⇔ π = (Sπ, Rπ) is a Nash equilibrium for Gameπ with utility functions U = (US, UR) • Advantages compared to [ACH11] • Game between two rational players • Characterize correctness and privacy by a single game • Malicious adversaries

  27. Conclusions (2/2) • The first step toward understanding how OT protocols work for rational players • Future work • Characterize OT with the ideal/real simulation-based security • Characterize other protocols • Explore good examples of rational cryptography
