
POLIPO: Policies & OntoLogies for Interoperability, Portability, and autOnomy



  1. POLIPO: Policies & OntoLogies for Interoperability, Portability, and autOnomy Daniel Trivellato

  2. Outline • Problem Definition • Approach • POLIPO • Language requirements • Policy language syntax • Reputation system • Credential Chain Discovery Algorithm

  3. Example Scenario (Diagram: a NATO surveillance mission involving GBR, USA, CANADA, and ITA. GBR's goal: "read if Senior Officer". NATO definition: a Senior Officer is an Officer with at least 10 years of service. ITA's reaction: "Senior Officer???" — the term is unknown in its local vocabulary.)

  4. Problem Definition (1/2) • Goal: Situational awareness in a System of Systems • independent, heterogeneous components → DISTRIBUTED AUTHORITY, MUTUAL UNDERSTANDING • dynamic (re-)configurations (join and leave) → AVAILABILITY, ACCOUNTABILITY

  5. Problem Definition (2/2) • Security goals: • protection of sensitive data from unauthorized disclosure, using content- and context-aware security policies • secure interaction between (possibly untrusted) parties of dynamic coalitions • interoperability between heterogeneous systems and policy models, tuning local policies to ensure global security

  6. Proposed Solutions • Access Control to specify the permissions of subjects on objects • Trust Management to establish trust between unknown parties • Ontologies to enable mutual understanding

  7. Ontologies (1/2) • Formally represent domain knowledge • Define concepts, instances and (binary) relationships in a domain • Constraints make it possible to infer information not explicitly stated • Each ontology can refer to concepts defined in another ontology (reusability) (Diagram: concepts NATO:Allied Country, MO:Officer, PSD:Junior Officer, PSD:Senior Officer; relationship MO:worksFor; instances NL, Jack, John.)

  8. Ontologies (2/2) • Ontologies can be used to give semantics to predicates in rules • Ontologies can also be used to align AC models • However, in a distributed system … • two entities may refer to the same object with different names • two entities may use the same name to refer to different objects

  9. The POLIPO Framework

  10. Application Domains • Semantic Web • Data protection on the web • Business Processes for Web Services • Virtual organizations • Maritime Safety and Security (MSS) • Healthcare • Business to Business (B2B)

  11. Language Requirements • Requirement 1: INTEROPERABILITY • Requirement 2: AUTONOMY • Requirement 3: PORTABILITY

  12. R1 - Interoperability Parties shall be able to interact with each other unambiguously. Ontologies denote the semantics of concepts and relationships in the domain

  13. R2 - Autonomy Every party shall be able to design and express its policy autonomously. A party must be able to specify its policy independently of the actions and definitions of other parties

  14. Example • Global ontology: Officer, with DISJOINT subconcepts Junior Officer and Senior Officer • Party 1 and Party 2 each define a local Temporary Officer concept → local extensions to the global ontology • Mappings from local to global concepts WHO PERFORMS THE MAPPINGS? HOW DO WE GUARANTEE THEIR CORRECTNESS?

  15. R3 - Portability Remote evaluation of policies shall preserve the interpretation of the policy owner • Remote policy evaluations should not grant any permission that would not be granted by a local evaluation • Use credentials to preserve interpretation of the policy owner

  16. Language Syntax • Atoms • Atoms are used to build rules • Sets of rules make policies

  17. Syntax: Basic Constructs • Ontology atoms: queries to the knowledge base, represented by an ontology • e.g., psd:SeniorOfficer(‘John’) psd:worksFor(‘John’,’BS’) • Credential atoms • e.g., cred(‘BS’,’psd:SeniorOfficer’,’John’, [(‘psd:validUntil’,’31/12/2009’)]) • Authorization atoms • e.g., perm(‘psd:read’, ‘John’, ‘File’) • Constraints: built-ins or user-defined predicates • e.g., X = Y + 3, aboutSurveillance(‘File’)
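As a toy illustration (not the POLIPO implementation or any actual API of it), the atom kinds above can be encoded as tagged Python tuples, which is enough to build rules out of them later:

```python
# Toy encoding of POLIPO atoms as tagged tuples; all function names
# are illustrative, not part of any actual POLIPO API.

def ontology_atom(predicate, *args):
    # e.g. psd:SeniorOfficer('John')
    return ('onto', predicate, args)

def credential_atom(issuer, attribute, subject, properties=()):
    # e.g. cred('BS', 'psd:SeniorOfficer', 'John',
    #           [('psd:validUntil', '31/12/2009')])
    return ('cred', issuer, attribute, subject, tuple(properties))

def authorization_atom(action, subject, obj):
    # e.g. perm('psd:read', 'John', 'File')
    return ('perm', action, subject, obj)

c = credential_atom('BS', 'psd:SeniorOfficer', 'John',
                    [('psd:validUntil', '31/12/2009')])
print(c)
```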

  18. Syntax: Rules • Horn clauses of the form h ← b1,…,bn • h (head) is an atom • b1,…,bn (body) are literals (i.e., positive or negative atoms) • Negation is treated as negation as failure • Safety condition: each variable in h, in a negative literal, or in a built-in also occurs in a positive body literal
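The safety condition can be checked mechanically. A minimal sketch, assuming atoms are (predicate, argument-tuple) pairs and variables are strings marked with a leading '?' (both conventions are illustrative, not POLIPO syntax):

```python
# Toy rule-safety checker; atoms are (predicate, args) pairs and
# variables are strings starting with '?' (illustrative conventions,
# not POLIPO syntax).

def is_var(term):
    return isinstance(term, str) and term.startswith('?')

def vars_of(atom):
    _, args = atom
    return {a for a in args if is_var(a)}

def is_safe(head, positives, negatives, builtins):
    """Each variable in the head, in a negative literal, or in a
    built-in must also occur in a positive body literal."""
    bound = set()
    for atom in positives:
        bound |= vars_of(atom)
    need = vars_of(head)
    for atom in negatives + builtins:
        need |= vars_of(atom)
    return need <= bound

# perm('psd:read',X,Y) <- aboutSurveillance(Y), cred('BS','psd:SeniorOfficer',X)
head = ('perm', ('psd:read', '?X', '?Y'))
body_pos = [('aboutSurveillance', ('?Y',)),
            ('cred', ('BS', 'psd:SeniorOfficer', '?X'))]
print(is_safe(head, body_pos, [], []))   # True
# Unsafe: '?Z' occurs only in the head
print(is_safe(('perm', ('psd:read', '?X', '?Z')), body_pos, [], []))  # False
```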

  19. Credential Release Rules • The head is a credential atom • The body can contain positive credential and ontology atoms, and constraints Example: cred(‘BS’,‘psd:SeniorOfficer’,X,[]) ← psd:SeniorOfficer(X)

  20. Authorization Rules • The head is an authorization atom • The body can contain positive credential, authorization and ontology atoms, constraints, and negative ontology atoms and constraints Example: perm(‘psd:read’,X,Y) ← aboutSurveillance(Y), cred(‘BS’,‘psd:SeniorOfficer’,X,[])

  21. Constraint Definition Rules • The head is a user-defined predicate • The body can contain positive ontology atoms and constraints Example: aboutSurveillance(X) ← bs:aboutMission(X,‘Surveillance’), bs:sensitivityLevel(X,Y), Y<3

  22. Policies • Credential Release Policy: set of credential release rules • Authorization Policy: set of authorization rules

  23. Problems… • Local models may not match the global ontology model • Global terms might be too coarse-grained to describe a specific domain • Policies need precise definitions to guarantee security within a domain • A complete and precise vocabulary alignment is costly • Not feasible in short- and mid-term cooperation

  24. Problems… (Diagram: GBR and ITA officer rank hierarchies aligned via the NATO grades OF-1 to OF-4.)

  25. …and Solution • Local terms to provide fine-grained definitions • Flexible mapping of • local to global terms • local to local terms • → MORE AUTONOMY • → INTEROPERABILITY • → AVOID CONFLICTING DEFINITIONS

  26. Ontology Alignment (1/2) (Diagram: GBR ranks under Officer: Admiral, Commodore, Captain, Lieutenant; ITA ranks under Ufficiale: Generale, Tenente Colonnello, Maggiore, Capitano; GBR's goal: "read if OF-3".)

  27. Ontology Alignment (2/2) • Mapping local to global concepts is necessary for mutual understanding • Mapping local to local concepts is also a possibility • However, mappings can be imprecise • no 100% equivalent concepts • entities have different mapping capabilities • Who performs the mapping? How? How do we know if we can trust it?

  28. TM + Reputation System • Extend ontology-based TM with a reputation system • every peer can define a mapping between two concepts • the trustworthiness (reputation) of a peer depends on the affinity of its opinions with those of the other peers • the final mapping is obtained by combining subjective opinions of peers based on their reputation

  29. Mapping two Concepts • Expressed by similarity credentials • e.g., sim(GBR,’Captain’,’SeniorOfficer’, [(degree,0.7),(timeStamp,2009/09/09)]) • Reflects inequality between concepts • Signed → non-repudiation • Similarity Credentials Repository • Exchanged through gossip protocols • Multiple entities can express similarity statements about the same concepts • contrasting opinions • which one should be considered?

  30. Naïve approach • Combine all the opinions • the average similarity degree is the “correct” one • However, not all peers are equally trustworthy • Similarity statements should be weighted according to each peer’s reputation

  31. Reputation • Reflects the accuracy of the similarity statements of a peer • Based on agreement with other peers • The agreement between two peers is proportional to the affinity of their similarity statements • Steps to compute reputation • For each pair of comparable similarity statements, compute their affinity • For each pair of peers, compute their agreement • Compute the reputation of all peers

  32. Affinity • Measures the level of correspondence between non-contradicting statements • st is a local similarity threshold that establishes when two statements are contradictory

  33. Local Similarity Threshold • Low values of st increase the number of statements considered • High values of st lead to a more accurate identification of trustworthy peers

  34. Agreement • Agreement values represented as a matrix • Updated when new credentials are acquired

  35. Computing Reputation • The reputation of a peer is a value in [0,1] • It is based on its agreement with the other peers, weighted by their reputation • The formula converges after t iterations • α is used to bias the computation on the initial reputation and guarantees convergence • More details in the paper…
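The slide defers the exact formula to the paper, so the following Python sketch is only a plausible reconstruction of the fixed-point idea: each peer's reputation is the reputation-weighted average of its agreement with the other peers, blended with its initial reputation via α.

```python
# Plausible reconstruction of the iterative reputation computation;
# the exact formula is only in the POLIPO paper, so the blending of
# agreement, current reputation, and alpha below is an illustrative
# guess, not the published method.

def compute_reputation(agreement, initial, alpha=0.2, iterations=50):
    """agreement[i][j] and initial[i] are values in [0,1].
    A peer's reputation is its agreement with the other peers,
    weighted by their current reputation; alpha biases the result
    toward the initial reputation and helps convergence."""
    n = len(initial)
    rep = list(initial)
    for _ in range(iterations):
        new = []
        for i in range(n):
            total = sum(rep[j] for j in range(n) if j != i)
            if total == 0:
                weighted = 0.0
            else:
                weighted = sum(agreement[i][j] * rep[j]
                               for j in range(n) if j != i) / total
            new.append(alpha * initial[i] + (1 - alpha) * weighted)
        rep = new
    return rep

# Toy agreement matrix for four peers (cf. the WS, BS, GC, GS example);
# the values are invented, and the fourth peer agrees poorly with all.
A = [[1.0, 0.8, 0.9, 0.2],
     [0.8, 1.0, 0.7, 0.3],
     [0.9, 0.7, 1.0, 0.1],
     [0.2, 0.3, 0.1, 1.0]]
rep = compute_reputation(A, initial=[1.0, 0.0, 0.0, 0.0])
print([round(r, 2) for r in rep])  # the poorly-agreeing peer ends lowest
```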

  36. Example • for st = 0.6 • Order of navies: WS, BS, GC, GS • Initial reputation: 1, 0, 0, 0 • Final reputation values: 0.81, 0.70, 0.89, 0.14

  37. Reputation-based Similarity • Computes similarity of attributes based on similarity statements • Weighted by the reputation of the issuer • Excluding opinions of untrustworthy peers • rt is a reputation threshold. Similarity credentials of peers with reputation lower than rt are discarded
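A minimal sketch of the reputation-weighted combination, assuming (illustratively) a weighted average over the statements of sufficiently reputable issuers; the function name, the data layout, and the similarity degrees below are not from the POLIPO implementation, though the reputation values match the earlier example.

```python
# Illustrative sketch (names and data layout are not from the POLIPO
# implementation): combine similarity statements about one pair of
# concepts, weighting each by its issuer's reputation and discarding
# issuers whose reputation is below the threshold rt.

def combined_similarity(statements, reputation, rt=0.5):
    """statements: list of (issuer, degree); reputation: issuer -> [0,1]."""
    trusted = [(iss, deg) for iss, deg in statements
               if reputation.get(iss, 0.0) >= rt]
    if not trusted:
        return None  # no trustworthy opinion available
    total = sum(reputation[iss] for iss, _ in trusted)
    return sum(reputation[iss] * deg for iss, deg in trusted) / total

# Reputation values from the slide's example (WS, BS, GC, GS);
# the similarity degrees are invented for illustration.
reps = {'WS': 0.81, 'BS': 0.70, 'GC': 0.89, 'GS': 0.14}
stmts = [('WS', 0.7), ('BS', 0.6), ('GC', 0.8), ('GS', 0.1)]
print(round(combined_similarity(stmts, reps, rt=0.5), 2))  # 0.71
# GS's outlier degree 0.1 is ignored: its reputation 0.14 < rt = 0.5
```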

  38. TM + Reputation System • Similarity can be exploited in rules • Peers may accept credentials about any attribute similar to a given attribute • perm(read,X,File1) ← cred(GBR,Ally,Y), cred(Y,Z,X), similar(0.5,Z,Captain) ≥ 0.6 • A peer can express policies just with known vocabulary → AUTONOMY • Peers are able to interpret unknown terms by similarity → INTEROPERABILITY

  39. Credential Chain Discovery • Credentials must be derived on request • To derive a credential c a peer needs to collect all the credentials on which c depends • Where do we find them? Who performs all the computations? • We need an algorithm to define a storage schema and a retrieval method

  40. The RT algorithms • 3 algorithms: • Backward search: top-down • Forward search: bottom-up • Bi-directional search • Designed to answer different query types • Work only if certain requirements on credential storage locations are satisfied

  41. Query Types • 3 possible query types • Type 1: cred(TU/e,student,Alice)? • Type 2: cred(TU/e,student,X)? • Type 3: cred(X,Y,Alice)? • Where do we start searching?

  42. Credential Storage • Query: Is Bart an employee of an accredited university? • All credentials stored by the issuer • Ask for all accredited universities • Ask each university if Bart is a student • All credentials stored by the subject • Ask Bart for all his credentials • Ask all issuers for entailed credentials… • Bart has 1000 credentials, 900 confidential… • Combine the two…

  43. But… • Consider • cred(TU/e,student,X) ← cred(PD,student,X) • cred(PD,student,Bart) • Query: Is Bart a TU/e student? • Now, what happens if both credentials are stored by the PD? • We cannot answer the query, as we do not know where to start from

  44. Well-typed Credentials • We need to regulate where credentials can be stored • Credentials and credential rules must be well-typed • Only if credentials are well-typed can all the solutions be retrieved • More details in the paper…

  45. Backward Search Algorithm • Top-down • Credentials stored by the issuer! • Build a graph in which nodes are labeled by roles • Each node gets a “list of participants” • Advantages • Goal-directed • Decentralized
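The goal-directed search can be illustrated with a minimal backward-chaining sketch over the credentials of the example that follows; note this is a generic top-down evaluator, not the actual RT backward search, which builds a graph of role nodes with "lists of participants".

```python
# Minimal top-down (backward-chaining) sketch of credential chain
# discovery; a real implementation would fetch credentials from each
# issuer over the network. Variables start with '?' and are global
# (no renaming), which is sufficient for this toy example.

facts = {
    ('DG', 'accredited', 'TU/e'), ('DG', 'accredited', 'UT'),
    ('DG', 'accredited', 'UvA'),
    ('PD', 'student', 'Alice'), ('PD', 'student', 'Bart'),
    ('PD', 'student', 'Charlie'),
}
rules = [
    # cred(DSA,student,X) <- cred(DG,accredited,Y), cred(Y,student,X)
    (('DSA', 'student', '?X'),
     [('DG', 'accredited', '?Y'), ('?Y', 'student', '?X')]),
    # cred(TU/e,student,X) <- cred(PD,student,X)
    (('TU/e', 'student', '?X'), [('PD', 'student', '?X')]),
]

def unify(a, b, env):
    for x, y in zip(a, b):
        x, y = env.get(x, x), env.get(y, y)
        if x.startswith('?'):
            env[x] = y
        elif y.startswith('?'):
            env[y] = x
        elif x != y:
            return None
    return env

def solve(goal, env=None):
    env = dict(env or {})
    goal = tuple(env.get(t, t) for t in goal)
    for fact in facts:          # try credentials directly
        e = unify(goal, fact, dict(env))
        if e is not None:
            yield e
    for head, body in rules:    # try credential rules, goal-directed
        e = unify(goal, head, dict(env))
        if e is not None:
            yield from solve_body(body, e)

def solve_body(body, env):
    if not body:
        yield env
    else:
        for e in solve(body[0], env):
            yield from solve_body(body[1:], e)

# Query: cred(DSA,student,Bart)?
print(next(solve(('DSA', 'student', 'Bart')), None) is not None)  # True
```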

  46. Example cred(DSA,student,X) ← cred(DG,accredited,Y), cred(Y,student,X) cred(DG,accredited,TU/e) cred(DG,accredited,UT) cred(DG,accredited,UvA) cred(DG,educationalInstitution,TU/e) cred(WUA,qualityInstitution,TU/e) cred(TU/e,student,X) ← cred(PD,student,X) cred(PD,student,Alice) cred(PD,student,Bart) cred(PD,student,Charlie) cred(ABN,client,Bart) cred(VISA,ccard,Bart)

  47. Example Query: cred(DSA,student,Bart)? (Graph: the DG.accredited node has participants TU/e, UT, UvA; the PD.student node has participants Alice, Bart, Charlie, which propagate to TU/e.student and then to DSA.student; the UT.student and UvA.student nodes stay empty, and the query succeeds for Bart.)

  48. Forward Search Algorithm • Bottom-up • Credentials stored by the subject! • Build a graph in which nodes are labeled by roles or principals • Each node gets a “list of roles it participates in or is a subset of” • Disadvantages: • privacy issues!

  49. Example cred(DSA,student,X) ← cred(DG,accredited,Y), cred(Y,student,X) cred(DG,accredited,TU/e) cred(DG,accredited,UT) cred(DG,accredited,UvA) cred(DG,educationalInstitution,TU/e) cred(WUA,qualityInstitution,TU/e) cred(TU/e,student,X) ← cred(PD,student,X) cred(PD,student,Alice) cred(PD,student,Bart) cred(PD,student,Charlie) cred(ABN,client,Bart) cred(VISA,ccard,Bart)
