
Systematic Testing and Verification of Security Policies

Tao Xie, Department of Computer Science, North Carolina State University. https://sites.google.com/site/asergrp/projects/policy. Joint work with Vincent Hu, Rick Kuhn, and the ACTS group (NIST)





Presentation Transcript


  1. Systematic Testing and Verification of Security Policies Tao Xie Department of Computer Science North Carolina State University https://sites.google.com/site/asergrp/projects/policy Joint Work with Vincent Hu, Rick Kuhn, and ACTS group (NIST) JeeHyun Hwang, Evan Martin (NCSU), Alex Liu (MSU)

  2. Motivation • Digital information is • Easy to access • Easy to search • Sensitive information requires access control mechanisms • Security policies are popularly used for access control • Access control policies for applications • Firewall policies for networks

  3. Motivation - cont. • How to ensure the correct specification of security policies? • What you specify is what you get, but not necessarily what you want • Solution: systematic testing and verification of security policies

  4. Example Access Control Policy • Subjects: Student, Faculty • Actions: Assign, Receive • Resources: Grades Rule 1: IF (faculty AND assign AND grades) Permit Rule 2: IF (student AND receive AND grades) Permit Rule 3: OTHERWISE Deny
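The three rules above can be sketched as a first-applicable rule list in Python. This is only an illustrative sketch: the names (RULES, evaluate) and the set-based request encoding are assumptions, not part of the talk.

```python
# Simplified model of the example policy: a request is a set of attribute
# values, and a rule matches when all of its required attributes are present.
RULES = [
    ({"faculty", "assign", "grades"}, "Permit"),   # Rule 1
    ({"student", "receive", "grades"}, "Permit"),  # Rule 2
]

def evaluate(request):
    """Return the decision for a request under first-applicable semantics."""
    for required, decision in RULES:
        if required <= request:  # rule matches: all its attributes are present
            return decision
    return "Deny"                # Rule 3: OTHERWISE Deny
```

For example, `evaluate({"faculty", "assign", "grades"})` yields "Permit", while a request with no matching rule falls through to the default "Deny".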

  5. Policy Verification • Verify policy against specified property What properties can you come up with for this policy? Rule 1: IF (faculty AND assign AND grades) Permit Rule 2: IF (student AND receive AND grades) Permit Rule 3: OTHERWISE Deny

  6. Policy Verification Property: student can never assign grades Rule 1: IF (faculty AND assign AND grades) Permit Rule 2: IF (student AND receive AND grades) Permit Rule 3: OTHERWISE Deny Violated with a counterexample request: faculty|student assign grades
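For a policy this small, the property "student can never assign grades" can be checked by brute-force enumeration of all attribute combinations. This is only an illustrative sketch of why the property is violated, not the model-checking technique used in the work; the rule encoding is the simplified set-based model.

```python
from itertools import combinations

ATTRS = ["faculty", "student", "assign", "receive", "grades"]

RULES = [
    ({"faculty", "assign", "grades"}, "Permit"),   # Rule 1
    ({"student", "receive", "grades"}, "Permit"),  # Rule 2
]

def evaluate(request):
    for required, decision in RULES:
        if required <= request:
            return decision
    return "Deny"  # Rule 3: OTHERWISE Deny

# Property: a student can never assign grades. Enumerate all 2^5 attribute
# combinations and collect the requests that violate the property.
violations = [set(c)
              for r in range(len(ATTRS) + 1)
              for c in combinations(ATTRS, r)
              if {"student", "assign", "grades"} <= set(c)
              and evaluate(set(c)) == "Permit"]
```

The slide's counterexample shows up here: a request carrying both the faculty and student roles plus assign and grades matches Rule 1 and is permitted, so the property is violated.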

  7. Policy Verification “when the specification language is sufficiently declarative, users have great difficulty providing a duplicate statement of behavior.” --- Shriram Krishnamurthi [RiseandRise 08] Rule 1: IF (faculty AND assign AND grades) Permit Rule 2: IF (student AND receive AND grades) Permit Rule 3: OTHERWISE Deny

  8. Our Approaches • Systematic policy verification • Property inference [POLICY 06, SSIRI 09, DBSec 10] • Property-quality assessment [ACASC 08] • Properties derived from access control models [POLICY 10DE] • Systematic policy testing • Structural coverage criteria [ICICS 06] • Fault models/mutation testing [WWW 07] • Test generation [SESS 07] • Policy engine performance [SIGMETRICS 08, TC] • Policy engine correctness [TAV-WEB 08] • Firewall policy testing/fixing [SRDS 08/09, LISA 10] • XACML policies • XACML engines • Firewall policies

  9. XACML • A standard access control policy language used to express access control policies • who can do what when • A request/response language used to express • queries about whether access should be allowed (requests) and • answers to those queries (responses) • http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml

  10. XACML Policy Structure • A Policy Set holds other policies or policy sets. • A policy is expressed as a set of rules. • Rules have targets and a set of conditions that determine if the rule applies to a given request. • Both rule and policy combining algorithms exist to reconcile conflicts. (Diagram: a policy with a target, containing rule1 and rule2, each with its own target and conditions cond1, cond2.) • http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml

  11. A Simple Scenario • A Subject who wishes to perform an Action on a Resource must do so through a PEP. • The PEP forms the XACML request and sends it to the PDP. • The PDP checks the request against the Policy and returns an XACML response. • The PEP either Permits or Denies access to the resource.
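The PEP/PDP hand-off can be sketched as two small classes. The class and method names are illustrative assumptions, and the rule encoding is the simplified set-based model from the example policy, not real XACML request/response messages.

```python
class PDP:
    """Policy Decision Point: checks a request against the policy."""
    def __init__(self, rules):
        self.rules = rules  # list of (required attribute set, decision)

    def decide(self, request):
        for required, decision in self.rules:
            if required <= request:
                return decision
        return "Deny"


class PEP:
    """Policy Enforcement Point: forms the request and enforces the decision."""
    def __init__(self, pdp):
        self.pdp = pdp

    def request_access(self, subject, action, resource):
        request = {subject, action, resource}   # form the request
        response = self.pdp.decide(request)     # send it to the PDP
        return response == "Permit"             # permit or deny access


pdp = PDP([({"faculty", "assign", "grades"}, "Permit"),
           ({"student", "receive", "grades"}, "Permit")])
pep = PEP(pdp)
```

With this wiring, `pep.request_access("faculty", "assign", "grades")` is permitted and any non-matching request is denied by the PEP.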

  12. Software Testing vs. Policy Testing • Software testing: test inputs are run against the program, and test outputs are compared with expected outputs • Policy testing: requests are evaluated against the policy, and responses are compared with expected responses

  13. Research Problems and Solutions • Test generation - Request generation • Policy Coverage Criteria • Random request generation • Request generation based on change-impact analysis • Mutation testing to assess fault-detection capability • Test-result inspection - Response inspection • Request selection and minimization based on structural coverage

  14. Structural Policy Coverage Criteria • Policy coverage: a policy is covered if its target matches • Rule coverage: a rule is covered if its target matches • Condition coverage: a condition must evaluate to both True and False to be covered entirely
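Rule coverage under first-applicable semantics can be measured as below. This is an illustrative sketch over the simplified set-based rule model, not the coverage measurement tool from the papers.

```python
RULES = [
    ({"faculty", "assign", "grades"}, "Permit"),   # Rule 1
    ({"student", "receive", "grades"}, "Permit"),  # Rule 2
]

def rule_coverage(requests):
    """Fraction of rules matched by at least one request; evaluation stops at
    the first matching rule, mirroring first-applicable combining."""
    covered = set()
    for req in requests:
        for i, (required, _) in enumerate(RULES):
            if required <= req:
                covered.add(i)
                break  # later rules are not evaluated for this request
    return len(covered) / len(RULES)
```

A single request matching only Rule 1 yields 50% rule coverage; adding a request that matches Rule 2 brings it to 100%.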

  15. Random Request Generation • The example policy: • Subjects: Student, Faculty • Actions: Assign, Receive • Resources: Grades • Model the set of attribute values as a vector of bits (Student, Faculty, Assign, Receive, Grades) and randomize the bits, e.g., 10000 or 01101
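The bit-vector scheme can be sketched as follows; the seed and request count are arbitrary choices for illustration.

```python
import random

# Attribute order matches the slide's bit vector:
# Student, Faculty, Assign, Receive, Grades
ATTRS = ["Student", "Faculty", "Assign", "Receive", "Grades"]

def random_request(rng):
    """Randomize one bit per attribute; set bits select attribute values."""
    bits = [rng.randint(0, 1) for _ in ATTRS]
    return {attr for attr, bit in zip(ATTRS, bits) if bit}

rng = random.Random(42)  # seeded for reproducibility
requests = [random_request(rng) for _ in range(5)]
```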

  16. Cirg: Change-Impact Request Generation • 1. Version synthesis: synthesize modified policy versions from the policy • 2. Change-impact analysis: compare the versions to produce counterexamples • 3. Request generation: derive requests from the counterexamples

  17. Cirg Example IF (faculty AND assign AND grades) Permit ELSE IF (student AND receive AND grades) Permit ELSE Deny • Counter-example • faculty, assign, grades : Permit → Deny

  18. Synthesized Versions Rationale: synthesize two versions whose differences are coverage targets • All-to-Empty • One-to-Empty • One-Increment • All-to-Minus-One • All-to-Change-One-Effect

  19. Margrave – Change-Impact Analysis Tool Multi-Terminal Decision Diagrams • Faculty (f) can assign (a) grades (g) • Students (s) can receive (r) grades (g) [Fisler et al. ICSE 05]

  20. Margrave Sample Output 1:/Subject, role, Faculty/ 2:/Subject, role, Student/ 3:/Resource, resource-class, ExternalGrades/ 4:/Resource, resource-class, InternalGrades/ 5:/Action, command, Assign/ 6:/Action, command, View/ 7:/Action, command, Receive/ 8:/Subject, role, TA/ 12345678 { 00010101 N->P 00011001 N->P 00100101 N->P 00101001 N->P 01010101 N->P 01011001 N->P 01100101 N->P 01101001 N->P }

  21. Software Mutation Testing • A mutator applies mutation operators to the program to create mutant programs • The same test inputs are run on the program and on each mutant • If the test outputs and mutant outputs differ, the mutant is killed

  22. Policy Mutation Testing • A mutator applies policy mutation operators to the policy to create mutant policies • The same requests are evaluated against the policy and against each mutant • If the responses and mutant responses differ, the mutant is killed
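The killing check can be sketched as below. ORIGINAL and is_killed are illustrative names, and the rule encoding is the simplified set-based model from the example policy, not the actual framework.

```python
def evaluate(rules, request):
    """First-applicable evaluation with a default Deny."""
    for required, decision in rules:
        if required <= request:
            return decision
    return "Deny"

ORIGINAL = [
    ({"faculty", "assign", "grades"}, "Permit"),   # Rule 1
    ({"student", "receive", "grades"}, "Permit"),  # Rule 2
]

def is_killed(mutant, requests):
    """A mutant is killed when some request gets a different response from
    the mutant policy than from the original policy."""
    return any(evaluate(ORIGINAL, r) != evaluate(mutant, r) for r in requests)
```

A request set that never exercises the mutated rule cannot kill the mutant, which is why coverage-based request selection matters for fault detection.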

  23. Components of Mutation Testing Framework • Mutator with policy mutation operators • Request evaluation against the policy and against each mutant policy • Comparison of responses and mutant responses: if they differ, the mutant is killed

  24. Research Questions • Does test selection based on structural coverage criteria produce request sets with high fault-detection capability? • What are the individual characteristics of each mutation operator? • Are some more difficult to kill than others? • Are some easily killed by request sets selected based on structural coverage criteria?

  25. Sample Policies • continue: 51 policies, 56 rules

  26. # of Requests Generated and Selected • continue: 373 (cirg), 500 (random), 32 (reduction)

  27. Coverage Results • continue: 32% RuleCov (random) vs. 98% RuleCov(cirg)

  28. Mutation Operators, Mutation, and Equivalent Mutant Detection (same framework diagram as slide 22, highlighting the mutator and mutation operators)

  29. Mutation Operators • Each operator mutates a different policy element: policy set, policy, rule, condition, and/or their associated targets and effects.

  30. Change Rule Effect (CRE) Example IF (faculty AND assign AND grades) Permit ELSE IF (student AND receive AND grades) Permit ELSE Deny • The CRE mutation operator is performed on each rule and changes the decision effect (Permit ↔ Deny)
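A CRE operator over the simplified set-based rule model might look like this; it is an illustrative sketch, not the framework's actual implementation.

```python
def cre_mutants(rules):
    """Change Rule Effect: flip each rule's effect (Permit <-> Deny), one
    rule at a time, yielding one mutant policy per rule."""
    flip = {"Permit": "Deny", "Deny": "Permit"}
    for i, (required, decision) in enumerate(rules):
        mutant = list(rules)                    # copy the rule list
        mutant[i] = (required, flip[decision])  # flip this rule's effect
        yield mutant

POLICY = [
    ({"faculty", "assign", "grades"}, "Permit"),   # Rule 1
    ({"student", "receive", "grades"}, "Permit"),  # Rule 2
]
mutants = list(cre_mutants(POLICY))
```

For the two-rule example policy, CRE produces two mutants, each with exactly one effect flipped to Deny.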

  31. Equivalent Mutant Detection • An equivalent mutant is semantically equivalent to, although syntactically different from, the original policy. • Equivalent mutants provide no value and waste resources. • We use change-impact analysis to detect equivalent mutants and remove them.

  32. Request Evaluation and Mutant Detection (same framework diagram as slide 22, highlighting request evaluation and response comparison)

  33. Sun’s XACML implementation • An open source implementation of the XACML standard in Java • Developed by Sun as part of an ongoing project on Internet Authorization in the Internet Security Research Group • http://sunxacml.sourceforge.net/

  34. Mutant-Killing Ratios by Subject

  35. Mutant-Killing Ratio by Operator

  36. Our Approaches • Systematic policy verification • Property inference [POLICY 06, SSIRI 09, DBSec 10] • Property-quality assessment [ACASC 08] • Properties derived from access control models [POLICY 10DE] • Systematic policy testing • Structural coverage criteria [ICICS 06] • Fault models/mutation testing [WWW 07] • Test generation [SESS 07] • Policy engine performance [SIGMETRICS 08, TC] • Policy engine correctness [TAV-WEB 08] • Firewall policy testing/fixing [SRDS 08/09, LISA 10] • XACML policies • XACML engines • Firewall policies

  37. Firewall Policy Structure • A Policy is expressed as a set of rules. • A Rule is represented as <predicate> → <decision> • <predicate> is a set of <clauses>, where each clause is a range over one packet field • (Example firewall policy table omitted: each field range is a <clause>; rule r1 has a <predicate> and a <decision>)

  38. Structural Coverage Definition Rationale: when the policy part with a fault is not evaluated (i.e., “covered”), the fault is often not exposed. • Rule coverage of a policy P by packets T = (#rules evaluated by at least one packet in T) / (#rules in P) • Predicate coverage of a policy P by packets T = (#predicates evaluated to true or false by T at least once) / (2 × #predicates in P) • Clause coverage of a policy P by packets T = (#clauses evaluated to true or false by T at least once) / (2 × #clauses in P)
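The rule- and clause-coverage formulas can be computed as below. The two-rule, two-field policy and its ranges are made up for illustration, and packets are modeled as tuples of field values.

```python
# Each rule: (list of (low, high) clauses over ordered packet fields, decision).
# The ranges below are hypothetical, not from the talk.
RULES = [
    ([(0, 127), (80, 80)],  "accept"),   # r1
    ([(0, 255), (0, 1023)], "discard"),  # r2
]

def coverage(packets):
    """Return (rule coverage, clause coverage) per the slide's definitions."""
    outcomes = set()   # (rule index, clause index, True/False)
    rules_hit = set()
    for pkt in packets:
        for ri, (clauses, _) in enumerate(RULES):
            match = True
            for ci, (lo, hi) in enumerate(clauses):
                result = lo <= pkt[ci] <= hi
                outcomes.add((ri, ci, result))
                if not result:
                    match = False
                    break              # clauses short-circuit
            if match:
                rules_hit.add(ri)
                break                  # first-match rule semantics
    n_clauses = sum(len(c) for c, _ in RULES)
    rule_cov = len(rules_hit) / len(RULES)
    clause_cov = len(outcomes) / (2 * n_clauses)
    return rule_cov, clause_cov
```

A single packet matching only r1 leaves r2 and all false clause outcomes uncovered, which is why random packets alone tend to yield low structural coverage.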

  39. Test Packet Generation Our objective: generating packets that achieve high structural coverage • Random Packet Generation: randomly selects values for a packet • Packet Generation based on Local Constraint Solving: considers individual rules in a policy • Packet Generation based on Global Constraint Solving: considers multiple rules in a policy
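Random packet generation over such field ranges might look like this; the field ranges and seed are hypothetical.

```python
import random

# Hypothetical packet fields for illustration: an address byte and a port.
FIELD_RANGES = [(0, 255), (0, 65535)]

def random_packet(rng):
    """Randomly select a value for each packet field."""
    return tuple(rng.randint(lo, hi) for lo, hi in FIELD_RANGES)

rng = random.Random(7)  # seeded for reproducibility
packets = [random_packet(rng) for _ in range(100)]
```

Constraint-solving generation would instead pick values that satisfy (or violate) a chosen rule's clauses directly, rather than sampling blindly.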

  40. Experiments (measuring coverage) • Test 14 firewall policies • Generate packets by our proposed three techniques • Measure structural coverage.

  41. Experiments (measuring coverage)

  42. Experiments (measuring fault-detection capability) • We also used reduced packet sets (maintaining the same level of structural coverage as the corresponding original packet sets)

  43. NCSU/NIST ACPT Architecture • GUI allows specification of users, groups, attributes, roles, rules, policies, and resources by an administrator • Data acquisition: API/mechanism to consume/acquire external data related to policies (user, attribute, resource, role, etc.) • AC model templates feed a policy generator that generates enforceable XACML policies (.xml) • Static verification: verify access control policies against properties • Dynamic verification: generate and evaluate test inputs based on structural or combinatorial coverage, yielding test inputs with their evaluated decisions • http://www.nist.gov/itl/csd/set/acpt.cfm

  44. Property Specification in ACPT

  45. Static Verification • Verifying the property against Policy A returns false with a counterexample.

  46. Static Verification (cont.) • Verifying the property against Policy B returns true.

  47. Test Input Generation and Evaluation

  48. XACML Generation

  49. Conclusion • Systematic policy verification • Property inference [POLICY 06, SSIRI 09, DBSec 10] • Property-quality assessment [ACASC 08] • Properties derived from access control models [POLICY 10DE] • Systematic policy testing • Structural coverage criteria [ICICS 06] • Fault models/mutation testing [WWW 07] • Test generation [SESS 07] • Policy engine performance [SIGMETRICS 08, TC] • Policy engine correctness [TAV-WEB 08] • Firewall policy testing/fixing [SRDS 08/09, LISA 10] • XACML policies • XACML engines • Firewall policies

  50. Questions? https://sites.google.com/site/asergrp/projects/policy
