
563.14.1 Tamper Resistant Architecture: Decentralized Label Model for Information Flow Control


Presentation Transcript


  1. 563.14.1 Tamper Resistant Architecture: Decentralized Label Model for Information Flow Control
  Presented by: Soumyadeb Mitra
  PISCES Group: Soumyadeb Mitra, Sruthi Bandhakavi, Ragib Hasan, Raman Sharikyn
  University of Illinois, Spring 2006

  2. Motivation for Decentralized Label Models
  • Security models have two goals:
    • Prevent malicious destruction of information
    • Control release and propagation of information
  • Traditional security models (access control lists, capabilities):
    • The first goal is supported
    • The second is only partially supported: information release can be restricted, but information propagation is not well controlled
  [Myers & Liskov]

  3. Motivating Example
  • A Java applet is downloaded from a remote site and run locally; its code is not trustworthy
  • Desired security assurance: restrict malicious transfer of information
  • But there is no way to control information propagation
  • Current approach: a sandbox, which is too restrictive
  • Possible solution: control information flow

  4. The Basic Idea
  • Assign security labels to data: who created it? who is allowed to see it?
  • Track data flowing through the system
  • Check for violations
  [Diagram: a Java applet does read(ƒ), where ƒ: {user: user}, computes z = ƒ, and then attempts write(socket, z), where socket: {:anyone}]

  5. Main Entities
  • Principals: represent users, who create data
  • Values: computations manipulate values
  • Slots: variables/objects acting as sources and sinks of values
  • Channels: input/output; values are obtained from input channels and written to output channels
  • Values, slots, and channels all have security labels associated with them

  6. Labels
  • A label has the form L = {owner: reader1, reader2}
  • Owner: the principal who is the source of the information
  • Readers: the principals to whom the owner is willing to release the data
  • Values, slots, and channels have labels
  • Labels restrict which assignments (x := v) are allowed; discussed later

  7. Derived Labels
  • During computation, values are derived from other values
  • A derived value must carry the "information" of its sources
  • Example:
    • x: {Alice: P, Q}
    • y: {Bob: Q, R}
    • z = x + y: {Alice: P, Q; Bob: Q, R}
  • Both owners' restrictions apply, so the effective reader set is {Q}

  8. Derived Labels
  • The label of z is the join of the labels of x and y: L1 ⊔ L2
    • owners(L1 ⊔ L2) = owners(L1) ∪ owners(L2)
    • readers(L1 ⊔ L2, o) = readers(L1, o) ∩ readers(L2, o)
  • Example:
    • x: {Alice: P, Q}
    • y: {Alice: P; Bob: P}
    • z = x + y: {Alice: P; Bob: P}
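To make the join concrete, here is a minimal Java sketch, assuming a label is represented as a map from each owner to that owner's reader set (the class and method names are ours, not from Myers & Liskov's implementation):

      import java.util.*;

      public class LabelJoin {
          static Map<String, Set<String>> join(Map<String, Set<String>> l1,
                                               Map<String, Set<String>> l2) {
              Map<String, Set<String>> out = new HashMap<>();
              // owners(L1 ⊔ L2) = owners(L1) ∪ owners(L2)
              for (Map<String, Set<String>> l : List.of(l1, l2)) {
                  for (var e : l.entrySet()) {
                      Set<String> readers = new HashSet<>(e.getValue());
                      // readers(L1 ⊔ L2, o) = readers(L1, o) ∩ readers(L2, o);
                      // an owner absent from one label imposes no restriction there.
                      if (out.containsKey(e.getKey())) readers.retainAll(out.get(e.getKey()));
                      out.put(e.getKey(), readers);
                  }
              }
              return out;
          }

          public static void main(String[] args) {
              Map<String, Set<String>> x = Map.of("Alice", Set.of("P", "Q"));
              Map<String, Set<String>> y = Map.of("Bob", Set.of("Q", "R"));
              // Prints {Alice: P, Q; Bob: Q, R}, matching the slide's z = x + y.
              System.out.println(join(x, y));
          }
      }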

  9. Restriction on Assignment
  • x := v: a value may be assigned to a slot only if the slot's label is a restriction of the value's label
  • A restriction intuitively means higher security
  • Examples (both disallowed):
    • Lv = {Alice: P, Q}, Lx = {Alice: P, Q, R}: R could get access to v through x
    • Lx = {Bob: P}, Lv = {Alice: Q}: Alice loses control over v, and P gets access to the data

  10. Restriction
  • Definition: L1 ⊑ L2 iff
    • owners(L1) ⊆ owners(L2)
    • readers(o, L1) ⊇ readers(o, L2) for every owner o
  • Examples:
    • {Alice: X, Y, Z} ⊑ {Alice: X}
    • {Alice: X, Y, Z; Bob: X, Y} ⊑ {Alice: X; Bob: Y}
    • {Alice: X} ⊑ {Alice: X; Bob: Y}
  • Assignment rule: x := v is allowed only if Lv ⊑ Lx
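The restriction check itself reduces to two set comparisons. A sketch under the same owner-to-reader-set representation as above (names are illustrative):

      import java.util.*;

      public class Restriction {
          static boolean flowsTo(Map<String, Set<String>> l1,
                                 Map<String, Set<String>> l2) {
              // owners(L1) ⊆ owners(L2): every policy in L1 is still enforced in L2.
              if (!l2.keySet().containsAll(l1.keySet())) return false;
              // For each owner o in L1: readers(o, L1) ⊇ readers(o, L2),
              // i.e. L2 may only remove readers, never add them.
              for (var e : l1.entrySet()) {
                  if (!e.getValue().containsAll(l2.get(e.getKey()))) return false;
              }
              return true;
          }

          public static void main(String[] args) {
              // {Alice: X,Y,Z} ⊑ {Alice: X} holds, as on the slide.
              System.out.println(flowsTo(Map.of("Alice", Set.of("X", "Y", "Z")),
                                         Map.of("Alice", Set.of("X"))));
              // The assignment rule x := v then amounts to flowsTo(Lv, Lx).
          }
      }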

  11. Declassification
  • Sometimes you want to release data on an insecure channel, e.g. Alice's cleartext may be sent to anyone once it has been encrypted:

      Send(string cleartext {Alice: Alice}) authority(Alice) {
          ...
          encryptext {Alice: Alice} = Encrypt(cleartext);
          ...
          channel {:anyone} = declassify(encryptext);  // OK
          channel {:anyone} = encryptext;              // Violated
      }

  • The user must explicitly authorize Send to declassify, via the authority(Alice) clause

  12. Implicit Information Flow
  • Consider:

      x := 0
      if (b) x := 1

  • This is equivalent to x := b: the assignment x := 1 depends on the value of b
  • This implicit information flow requires the extra constraint Lb ⊑ Lx

  13. Implicit Information Flow
  • Define a label for the program counter:
    • Lpc = ⊔ { Lv : v was used to arrive at pc }
  • The assignment rule becomes: x := v requires Lv ⊔ Lpc ⊑ Lx
  • In the previous example (x := 0; if (b) x := 1), Lpc = Lb inside the branch, which forces Lb ⊑ Lx
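The following toy sketch shows this check rejecting the example above. It simplifies labels to plain reader sets, so join is set intersection and L1 ⊑ L2 means L1 ⊇ L2; all names are ours:

      import java.util.*;

      public class PcLabel {
          static Set<String> join(Set<String> a, Set<String> b) {
              Set<String> r = new HashSet<>(a); r.retainAll(b); return r;
          }
          static boolean flowsTo(Set<String> l1, Set<String> l2) {
              return l1.containsAll(l2);
          }

          public static void main(String[] args) {
              Set<String> Lb  = Set.of("Alice");         // b is Alice-private
              Set<String> Lx  = Set.of("Alice", "Bob");  // Bob may read x
              Set<String> Lpc = Lb;                      // inside "if (b) ..."
              Set<String> Lv  = Set.of("Alice", "Bob");  // the constant 1 is public
              // x := 1 inside if (b): require Lv ⊔ Lpc ⊑ Lx.
              // Prints false: Bob could learn b by reading x.
              System.out.println(flowsTo(join(Lv, Lpc), Lx));
          }
      }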

  14. Other Information Flows
  • Termination channels
  • Timing channels
  • Resource exhaustion channels
  • Power channels
  [Sabelfeld and Myers]

  15. Confidentiality Constraints
  [Diagram: Alice sends data into a System observed by Eve1 through Eve5]
  • Guarantee that Alice's data is not released to Eve

  16. Integrity Constraints
  [Diagram: Alice receives data from a System that Eve1 through Eve5 can influence]
  • Guarantee that the data Alice receives is not corrupted

  17. Basic Idea
  • Assign integrity labels to data
    • x: {?: a, b, c} means a, b, and c trust the data x
  • For x := v, anyone trusting x must also trust v:
    • TrustSet(x) ⊆ TrustSet(v), written Lv ⊑ Lx
  • The join of integrity labels is defined analogously, with trust sets intersected
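A minimal sketch of the integrity rule, representing an integrity label as the set of principals who trust the data (a representation we choose for illustration):

      import java.util.*;

      public class IntegrityCheck {
          // x := v is allowed only if everyone who trusts x also trusts v:
          // TrustSet(x) ⊆ TrustSet(v), i.e. Lv ⊑ Lx for integrity labels.
          static boolean assignOk(Set<String> trustX, Set<String> trustV) {
              return trustV.containsAll(trustX);
          }
          // Join: the result is trusted only by principals trusting both
          // inputs, so the trust sets are intersected.
          static Set<String> join(Set<String> a, Set<String> b) {
              Set<String> r = new HashSet<>(a); r.retainAll(b); return r;
          }

          public static void main(String[] args) {
              Set<String> x = Set.of("a", "b", "c"); // x: {?: a, b, c}
              Set<String> v = Set.of("a", "b");      // only a and b trust v
              System.out.println(assignOk(x, v));    // false: c trusts x but not v
              System.out.println(join(Set.of("a", "b"), Set.of("a", "c"))); // [a]
          }
      }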

  18. Details
  • Inference and verification are done at compile time
  • Some data items are assigned labels explicitly; the labels of the others are derived so as to satisfy the constraints
  • Lpc depends on other variables' labels, which in turn may depend on Lpc
  • So: formulate a set of equations and solve them simultaneously
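One standard way to solve such label equations is fixed-point iteration: start each unannotated variable at the least restrictive label and repeatedly tighten it until the constraints stabilize. The sketch below does this for a two-constraint toy program; the program and all names are invented for illustration:

      import java.util.*;

      public class Inference {
          public static void main(String[] args) {
              Set<String> everyone = Set.of("Alice", "Bob", "Carol");
              Map<String, Set<String>> L = new HashMap<>();
              L.put("b", Set.of("Alice"));          // annotated: b is Alice-only
              L.put("x", new HashSet<>(everyone));  // to be inferred
              L.put("y", new HashSet<>(everyone));  // to be inferred

              // Constraints from "x := 0; if (b) x := 1; y := x":
              // L(x) must be at least L(pc) = L(b), and L(y) at least L(x).
              boolean changed = true;
              while (changed) {
                  changed = false;
                  changed |= constrain(L, "x", L.get("b"));
                  changed |= constrain(L, "y", L.get("x"));
              }
              System.out.println(L); // x and y both end up readable by Alice only
          }

          // Tighten L(var) by intersecting in 'bound'; report whether it changed.
          static boolean constrain(Map<String, Set<String>> L, String var,
                                   Set<String> bound) {
              Set<String> cur = new HashSet<>(L.get(var));
              boolean changed = cur.retainAll(bound);
              if (changed) L.put(var, cur);
              return changed;
          }
      }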

  19. 563.14.2 Tamper Resistant Architecture: Secure Program Partitioning
  Presented by: Sruthi Bandhakavi
  PISCES Group: Soumyadeb Mitra, Sruthi Bandhakavi, Ragib Hasan, Raman Sharikyn
  University of Illinois

  20. PISCES
  • Protocols and Implementation for Smart Card Enabled Software
  • Focus on two technologies:
    • Information flow: can we split the code by taking the information flow in the programs into consideration?
    • Model-based design: can we find a high-level model to represent the programs and use it to automatically split them and produce code?

  21. Partitioning Jif Programs
  • Solve a constraint system to determine the possible hosts
  • Use dynamic programming and heuristics to find an efficient solution
  • Rewrite the program, inserting calls to the runtime system for data forwarding and control transfers
  • Output: a collection of Java code fragments with host assignments
  [Zdancewic, Zheng, Nystrom, Myers]

  22. Secure Program Partitioning
  [Diagram: Source Code + Policy → Compiler → Splitter (guided by Trust info) → subprograms and runtime on Host 1, Host 2, Host 3]

  23. Secure Program Partitioning
  [Same diagram; highlights Source Code + Policy: describes the computation and the principals' security policies]

  24. Secure Program Partitioning
  [Same diagram; highlights the Compiler: verifies that the program obeys the security policies]

  25. Secure Program Partitioning
  [Same diagram; highlights Trust info: describes the trust relationships between principals and hosts]

  26. Secure Program Partitioning
  [Same diagram; highlights the Splitter: partitions the data and computation among hosts, so that the policies are obeyed]

  27. Secure Program Partitioning
  [Same diagram; highlights the runtime: performs dynamic access control checks and encrypts communication]

  28. Security Assurance
  • Goal: the resulting distributed program performs the same computation as the source and also satisfies the security policies
  • Guarantee: principal P's security policy is violated only if a host that P trusts fails or is subverted
  • Example: hosts A, B, C with "Alice trusts A & C" and "Bob trusts B & C"
    • If B fails, Alice's policy is still obeyed; Bob's policy may be violated

  29. Secure Program Partitioning
  [Same architecture diagram as slide 22]

  30. Confidentiality Policies in Jif
  • Confidentiality labels: int{Alice} a; ("a is Alice's private int")
  • Integrity labels: int{?Alice} a; ("Alice must trust a")
  • Combined labels: int{Alice, ?Alice} a; (both constraints)

      int{Alice} a1, a2;
      int{Bob} b;

      // Secure
      a1 = a2;

      // Insecure
      a1 = b;
      b = a1;

  31. Policy Operations in Jif
  • Declassification ("type-cast int{Alice} to int{Bob}"):
      int{Alice} a; declassify(a to Bob);
  • Endorse:
      int{?Bob} b; endorse(b by Alice);
  • But (!) Alice must trust the integrity of the decision to perform the policy operation; the compiler guarantees this integrity

  32. Example: Oblivious Transfer
  [Diagram: Bob sends request(n) to Alice, who holds int m1; int m2; and replies with answer(mn)]
  • Alice has two integers: m1 and m2
  • Alice's policy: "Bob gets to choose exactly one of m1 and m2."
  • Bob's policy: "Alice doesn't get to know which item I request."
  • Classic result: "Impossible to solve using 2 principals, with perfect security."

  33. Oblivious Transfer (Java)

      int m1, m2;        // Alice's data
      boolean accessed;
      int n, ans;        // Bob's data

      n = choose();      // Bob's choice
      if (!accessed) {   // Transfer
          accessed = true;
          if (n == 1) ans = m1;
          else ans = m2;
      }

  34. Adding Confidentiality Labels

      int{Alice} m1, m2; // Alice's data
      boolean accessed;
      int{Bob} n, ans;   // Bob's data

      n = choose();      // Bob's choice
      if (!accessed) {   // Transfer
          accessed = true;
          if (n == 1) ans = m1;
          else ans = m2;
      }

  • Verification fails: Alice's data flows to Bob's variable ans

  35. Using Declassification

      int{Alice} m1, m2; // Alice's data
      boolean accessed;
      int{Bob} n, ans;   // Bob's data

      n = choose();      // Bob's choice
      if (!accessed) {   // Transfer
          accessed = true;
          if (n == 1) ans = declassify(m1 to Bob);
          else ans = declassify(m2 to Bob);
      }

  • Verification still fails: the decision to declassify depends on data Alice does not trust

  36. Integrity Constraints

      int{Alice} m1, m2;        // Alice's data
      boolean{?Alice} accessed;
      int{Bob} n, ans;          // Bob's data

      n = choose();             // Bob's choice
      if (!accessed) {          // Transfer
          accessed = true;
          if (n == 1) ans = declassify(m1 to Bob);
          else ans = declassify(m2 to Bob);
      }

  • Verification still fails: the branch on n is not trusted by Alice

  37. Using Endorsement

      int{Alice} m1, m2;        // Alice's data
      boolean{?Alice} accessed;
      int{Bob} n, ans;          // Bob's data

      n = choose();             // Bob's choice
      if (!accessed) {          // Transfer
          accessed = true;
          if (endorse(n by Alice) == 1) ans = declassify(m1 to Bob);
          else ans = declassify(m2 to Bob);
      }

  38. Secure Program Partitioning
  [Same architecture diagram as slide 22]

  39. Trust Configurations
  • Labels describe the trust relationships between principals and the available hosts
  • Confidentiality: Host A: {Alice} means "Alice trusts host A not to leak her confidential data."
  • Integrity: Host A: {?Alice} means "Alice trusts host A not to corrupt her high-integrity data."
  • Consequences:
    • int{Alice} m1; m1 can be sent to A
    • int{Bob} n; n cannot be sent to A

  40. Host Selection
  • Consider a field: int{Alice:; ?:Alice} f;
  • Host H has confidentiality label Ch and integrity label Ih
  • Constraints: {Alice:} ⊑ Ch and Ih ⊑ {?:Alice}
  • Generalized to program statements S:
    • C(values used by S) ⊑ Ch
    • Ih ⊑ I(locations defined by S)
  • There are additional constraints on declassify()
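A sketch of the field-placement check, using reader sets for confidentiality and truster sets for integrity (this encoding and all names are our simplification of Jif labels):

      import java.util.*;

      public class HostSelection {
          static boolean canHost(Set<String> cField, Set<String> iField,
                                 Set<String> cHost,  Set<String> iHost) {
              // Confidentiality: C(field) ⊑ Ch; in reader sets,
              // readers(field) ⊇ readers(Ch).
              boolean confOk = cField.containsAll(cHost);
              // Integrity: Ih ⊑ I(field); everyone trusting the field
              // must trust the host.
              boolean intOk = iHost.containsAll(iField);
              return confOk && intOk;
          }

          public static void main(String[] args) {
              // int{Alice:; ?:Alice} f on host A with Ch = {Alice}, Ih = {Alice}: OK.
              System.out.println(canHost(Set.of("Alice"), Set.of("Alice"),
                                         Set.of("Alice"), Set.of("Alice")));
              // The same field on Bob's host B: rejected on both counts.
              System.out.println(canHost(Set.of("Alice"), Set.of("Alice"),
                                         Set.of("Bob"), Set.of("Bob")));
          }
      }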

  41. A Secure Solution
  [Diagram: the oblivious-transfer code split across three hosts]
  • Host A {Alice, ?Alice}: bool accessed; int m1, m2; goto(B); ... if (!accessed) { accessed = true; goto(T); }
  • Host B {Bob, ?Bob}: int n, ans; int choose() { ... return n; } n = choose(); goto(A);
  • Host T {Alice, ?Alice, Bob}: int n' = get(n, B); if (n' == 1) set(ans, m1); else ...

  42. Secure Program Partitioning
  • Language-based confidentiality policies
  • The compiler splits a program among heterogeneously trusted hosts, guided by the security policies
  • The resulting distributed program satisfies the policies
  • Benefits:
    • End-to-end security
    • Decentralized
    • Automatic
    • Explicit policies

  43. Our Project
  • Extend the same concept to an implementation-independent model
  • EFSMs are very simple and can model a large number of systems
  • Our model of EFSMs: a set of states, where each state is either
    • x := v; GOTO nextstate, or
    • if (P) GOTO state1 ELSE GOTO state2
  • Special variables: in, out
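As a concrete rendering of this model, here is a minimal sketch of the two state forms in Java (the sealed interface and record names are our own):

      // A state is either an assignment with one successor, or a two-way branch.
      sealed interface State permits Assign, Branch {}
      record Assign(String x, String v, int next) implements State {}       // x := v; GOTO next
      record Branch(String p, int onTrue, int onFalse) implements State {}  // if (P) GOTO s1 ELSE GOTO s2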

  44. Example EFSM
  [State diagram: S0: n = in → S1: if (isAccessed), with branches 1/0 and isAccessed = 0 set on the path that proceeds → S2/S3: if (n), with branches 1/0 → S4: out = declassify(m1) or out = declassify(m2) → S5: END]

  45. Security Labels & Type Checking
  • All variables have a confidentiality constraint Cx and an integrity constraint Ix
  • States also have constraints:
    • Cstate = ⊔ { Cv : the state depends on v }
    • Istate = ⊔ { Iv : the state depends on v }
  • Assignment x := v requires:
    • Cv ⊔ Cstate ⊑ Cx
    • Iv ⊔ Istate ⊑ Ix
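A small sketch of the state-label check, again with reader-set confidentiality labels (join = intersection, ⊑ = ⊇); the variables come from the example EFSM, the encoding is our simplification:

      import java.util.*;

      public class StateCheck {
          static Set<String> join(Set<String> a, Set<String> b) {
              Set<String> r = new HashSet<>(a); r.retainAll(b); return r;
          }

          public static void main(String[] args) {
              // The state executing out = declassify(m1) is reached through
              // branches on isAccessed and n, so Cstate = C(isAccessed) ⊔ C(n).
              Set<String> cIsAccessed = Set.of("Alice");
              Set<String> cN = Set.of("Bob");
              Set<String> cState = join(cIsAccessed, cN); // empty: most restrictive
              // x := v requires Cv ⊔ Cstate ⊑ Cx, i.e. join(...) ⊇ Cx.
              Set<String> cOut = Set.of("Bob");
              Set<String> cM1 = Set.of("Alice");
              // Prints false: out := m1 is rejected without declassification.
              System.out.println(join(cM1, cState).containsAll(cOut));
          }
      }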

  46. Mapping states to hosts
  • Each state is mapped to some host
  • An assignment state x := v can be mapped to h if
    • Cx ⊑ Ch
    • Ih ⊑ Ix
  • A branch state if (P) GOTO s0 ELSE GOTO s1 can be mapped to h if
    • CP ⊔ Cstate ⊑ Ch
    • IP ⊑ Ih
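A sketch of the mapping check against the oblivious-transfer hosts, reusing the set encodings from the earlier sketches. T's integrity label here is our assumption, since slide 47 gives only its confidentiality components:

      import java.util.*;

      public class StateMapping {
          // A state reading data labeled Cx/Ix may be mapped to host h iff
          // Cx ⊑ Ch and Ih ⊑ Ix. With reader sets, Cx ⊑ Ch means
          // readers(Cx) ⊇ readers(Ch); with truster sets, Ih ⊑ Ix means
          // trusters(Ih) ⊇ trusters(Ix).
          static boolean canMap(Set<String> cx, Set<String> ix,
                                Set<String> ch, Set<String> ih) {
              return cx.containsAll(ch) && ih.containsAll(ix);
          }

          public static void main(String[] args) {
              // Host B = {Bob:; ?Bob}. Host T is cleared by both Alice and Bob,
              // which in the reader-set encoding is the empty (most restrictive)
              // bound; we also assume both principals trust T's integrity.
              Set<String> chB = Set.of("Bob"), ihB = Set.of("Bob");
              Set<String> chT = Set.of(),      ihT = Set.of("Alice", "Bob");

              // For S4 to compute out = declassify(m1), its host must be able
              // to read m1, labeled {Alice:; ?Alice}:
              Set<String> cM1 = Set.of("Alice"), iM1 = Set.of("Alice");
              System.out.println(canMap(cM1, iM1, chT, ihT)); // true: T may run S4
              System.out.println(canMap(cM1, iM1, chB, ihB)); // false: B must not see m1
          }
      }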

  47. Example EFSM Annotated
  • Host labels: CA = {Alice:; ?Alice}, CB = {Bob:; ?Bob}, CT = {Alice:; Bob:}
  • Variable labels: m1, m2, isAccessed: {Alice:; ?Alice}; n: {Bob:}; in: {Bob:}; out: {Bob:}
  [State diagram from slide 44, with each state annotated with the hosts that may run it: n = in on B or T; if (isAccessed) and isAccessed = 0 on A; if (n) and the two declassify outputs on T]

  48. Splitting of EFSM
  [Diagram: the example EFSM split into per-host subgraphs on A, B, and T, each state (n = in; if (isAccessed); isAccessed = 0; if (n); out = declassify(m1) / out = declassify(m2); END) placed on a host satisfying the mapping constraints]
