
Trusted OS Design




  1. Trusted OS Design CS461/ECE422 Spring 2008

  2. Administrative • Mid-term II is re-scheduled to 6:30 – 7:45pm on Thursday April 10, in Everitt Hall 151 • HW #4 is out, due on March 13, 3:30pm

  3. Reading Material • Section 5.4 of Security in Computing

  4. Overview • Design Principles • Security Features • Kernelized Design • Virtualization

  5. Design Principles • Simplicity • Less to go wrong • Fewer possible inconsistencies • Easy to understand • Restriction • Minimize access • Inhibit communication • (Saltzer and Schroeder, 1975)

  6. Economy of Mechanism • Keep the design as simple and small as possible • Simpler means less can go wrong • And when errors occur, they are easier to understand and fix • Interfaces and interactions

  7. Fail-Safe Defaults • Base access decisions on permission rather than exclusion • Burden of proof is on the principal seeking permission • If the protection system fails, legitimate access may be denied, but illegitimate access is denied as well
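The fail-safe-defaults rule can be sketched as a default-deny check. A minimal Python sketch; the ACL contents and names here are hypothetical, not from the slides:

```python
# Hypothetical ACL mapping (subject, object) to the actions explicitly granted.
ACL = {
    ("alice", "report.txt"): {"read", "write"},
    ("bob", "report.txt"): {"read"},
}

def is_allowed(subject, obj, action):
    """Fail-safe default: anything not explicitly granted is denied."""
    return action in ACL.get((subject, obj), set())
```

Note that an unknown principal falls through to the empty set and is denied; the check fails closed rather than open.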

  8. Complete Mediation • Every access to every object must be checked for authority • Usually done once, on first action • UNIX: access checked on open, not checked thereafter • If permissions change afterwards, a subject may retain unauthorized access • Proposals to gain performance by remembering the result of an authority check should be examined skeptically
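The open-time-check pitfall can be illustrated with a file wrapper that re-validates authority on every read, so revocation takes effect immediately. An illustrative Python sketch; the class and ACL shape are hypothetical:

```python
class MediatedFile:
    """Re-validates authority on each access, not just at open time."""
    def __init__(self, name, subject, acl):
        self.name, self.subject, self.acl = name, subject, acl

    def read(self):
        # Complete mediation: the check runs on *every* read, so a
        # permission revoked after open is enforced on the next access.
        if "read" not in self.acl.get((self.subject, self.name), set()):
            raise PermissionError(f"{self.subject} may not read {self.name}")
        return f"<contents of {self.name}>"
```

Contrast this with the UNIX behavior above, where a descriptor obtained before revocation keeps working.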

  9. Open Design • The design should not be secret • Do not depend on secrecy of design or implementation • Popularly misunderstood to mean that source code should be public • “Security through obscurity” • Does not apply to information such as passwords or cryptographic keys

  10. Separation of Privilege • Where feasible, a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of only a single key. • Require multiple conditions to grant privilege • Separation of duty • Defense in depth
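A minimal sketch of the two-key idea: a privileged operation proceeds only when enough distinct principals have approved. The function name and threshold are hypothetical:

```python
def dual_control_ok(approvals, required=2):
    """Separation of privilege: require `required` *distinct* approvers."""
    return len(set(approvals)) >= required
```

Duplicate approvals from the same principal do not count twice, which is the point of separation of duty.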

  11. Least Privilege • Every program and every user of the system should operate using the least set of privileges necessary to complete the job • A subject should be given only those privileges necessary to complete its task • Function, not identity, controls • Rights added as needed, discarded after use • Minimal protection domain
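"Rights added as needed, discarded after use" can be sketched as a context manager that grants a right only for the duration of one task. The privilege-set representation is hypothetical:

```python
from contextlib import contextmanager

@contextmanager
def with_privilege(domain, right):
    """Temporarily add `right` to the protection domain, then drop it."""
    domain.add(right)
    try:
        yield domain
    finally:
        domain.discard(right)   # the right is dropped even if the task fails
```

The `finally` clause is what makes the domain minimal again after use, even on an exception.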

  12. Least Common Mechanism • Minimize the amount of mechanism common to more than one user and depended on by all users • Mechanisms should not be shared • Information can flow along shared channels • Covert channels • Isolation • Virtual machines • Sandboxes

  13. Psychological Acceptability • It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly • Security mechanisms should not add to the difficulty of accessing a resource • Hide complexity introduced by security mechanisms • Ease of installation, configuration, use • Human factors are critical here

  14. Security Features • Identification and Authentication • MAC vs DAC • Object Reuse Protection • Prevent leaks via reallocation • Clean before re-use
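Object reuse protection amounts to scrubbing a resource before reallocation. A toy buffer pool in Python; the pool API is hypothetical, and a real kernel would zero memory pages rather than bytearrays:

```python
class BufferPool:
    """Zeroes every buffer before handing it out, so data left behind by a
    previous owner cannot leak through reallocation."""
    def __init__(self, size, count=1):
        self.free = [bytearray(size) for _ in range(count)]

    def release(self, buf):
        self.free.append(buf)

    def acquire(self):
        buf = self.free.pop()
        buf[:] = bytes(len(buf))   # scrub (zero) before reuse
        return buf
```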

  15. More Security Features • Complete Mediation • Mediate all means of access • File access plus direct memory access if possible • Mediate on each access, not generally done for files

  16. More Security Features • Trusted Path • Gives the end user a means to determine that they are really talking to the OS • Secure Attention Key (SAK): key sequence that cannot be intercepted by non-OS code • Ctrl-Alt-Del in Windows • Rootkit… • Or security-relevant changes only made during system boot • What about networked applications?

  17. More Security Features • Audit • Must be able to review and recreate security-relevant changes • Must protect the log • Log growth is a concern • Originally assumed a security officer would review logs directly • Can be used as supporting evidence • Really want to detect anomalies • Intrusion detection
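One common way to protect a log is to make it tamper-evident. A minimal hash-chaining sketch in Python; this class is illustrative, not a mechanism described in the slides:

```python
import hashlib

class AuditLog:
    """Append-only log; each entry's digest chains over the previous one,
    so altering any earlier record breaks verification."""
    def __init__(self):
        self.entries = []            # list of (record, hex_digest)
        self._prev = b"\x00" * 32

    def append(self, record):
        digest = hashlib.sha256(self._prev + record.encode()).digest()
        self.entries.append((record, digest.hex()))
        self._prev = digest

    def verify(self):
        prev = b"\x00" * 32
        for record, hexdigest in self.entries:
            digest = hashlib.sha256(prev + record.encode()).digest()
            if digest.hex() != hexdigest:
                return False         # chain broken: record was tampered with
            prev = digest
        return True
```

Chaining makes tampering detectable, not impossible; the log still needs write protection and, ideally, remote copies.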

  18. Kernelized Design • Contain security feature implementation in a security kernel • Coverage • Separation • Unity • Modifiability • Compactness • Verifiability • (Diagram: layered view of user space, OS kernel, and security kernel)

  19. Reference Monitor • Reference Monitor – abstract machine that mediates all access to objects by subjects • Reference Validation Mechanism (RVM) – Implementation of a Reference Monitor • Tamper-proof • Well defined • Never bypassed • Small enough for analysis and testing
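The RVM's "never bypassed" property can be sketched as a single choke point through which every operation must pass. Names and the policy shape here are hypothetical:

```python
class ReferenceMonitor:
    """Mediates every (subject, object, action) request; the guarded
    operation runs only if the policy check succeeds."""
    def __init__(self, policy):
        self._policy = policy        # e.g. {("alice", "db"): {"read"}}

    def access(self, subject, obj, action, operation):
        if action not in self._policy.get((subject, obj), set()):
            raise PermissionError("access denied")
        return operation()           # invoked only after the check passes
```

Keeping the monitor this small is what makes the "small enough for analysis and testing" requirement plausible.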

  20. Trusted Computing Base (TCB) • Includes all protection mechanisms (HW, firmware, and software) responsible for enforcing the security policy • A strong boundary around the TCB is critical • Any code trusted by an element of the TCB must be part of the TCB too • If a portion of the TCB is corrupted, must assume that all of the TCB can be corrupted

  21. TCB Components • TCB can include • Hardware • Primitive files • Authentication info • Access Control info • Protected Memory • For Reference Monitor Execution • Some inter-process communication

  22. TCB/non-TCB Function Split

  23. TCB Implementation • Ideally the TCB is a separate security kernel • e.g. SCOMP, 10K lines of code in the security kernel • Generally not feasible for a retrofitted kernel • True of most trusted Unix variants • Security-relevant functionality distributed through the OS kernel

  24. Virtualization • Can design a virtualization layer to separate multiple users • Memory virtualization • As exemplified by IBM MVS • Virtual machines • Book discusses IBM PR/SM • More recently exemplified in VMware and Xen • A malicious program cannot access another virtual memory space or machine • Unless it attacks the virtualization mechanism itself

  25. Memory Virtualization

  26. Machine Virtualization

  27. Key Points • Principles of secure design underlie all security-related mechanisms • Require: • Good understanding of goal of mechanism and environment in which it is to be used • Careful analysis and design • Careful implementation

  28. Information Assurance CS461/ECE422 Spring 2008 Evaluating Systems

  29. Reading Material • Chapter 5.5 of Security in Computing • The orange book and the whole rainbow series • http://www.radium.ncsc.mil/tpep/library/rainbow/ • The common criteria • Lists all evaluated protection profiles and products • http://www.commoncriteriaportal.org

  30. Outline • Motivation for system evaluation • Specific evaluation systems • TCSEC/Orange Book • Interim systems • Common Criteria

  31. Evaluation Goals • Oriented to purchaser/user of system • Assurance that system operates as advertised

  32. Evaluation Options • Rely on vendor/developer evidence • Self-evaluate vendor design docs, test results, etc • Base on reputation of vendor • Rely on an expert • Read product evaluations from trusted source • Penetration testing

  33. Evaluation Options • Formal Verification • Validation • Requirements checking • Design and Code review • System Testing

  34. Formal Evaluation • Provides a systematic framework for system evaluation • More consistent evaluation • Better basis for comparing similar products • Trusted third-party system for evaluation • Originally driven by the needs of government and military

  35. TCSEC: 1983-1999 • Trusted Computer System Evaluation Criteria (TCSEC) also called the Orange Book • Specifies evaluation classes (D, C1, C2, B1, B2, B3, A1) • Specifies functionality and assurance requirements for each class • Functional Model builds on • BLP (Bell-LaPadula model, mandatory labelling) • Reference Monitors

  36. TCSEC Functional Requirements • DAC • Object Reuse • Sufficient clearing of objects between uses in a resource pool • E.g. zeroing pages in the memory system • MAC and Labels • Identification and Authentication • Audit • Requirements increase at higher classes • Trusted Path • Non-spoofable means to interact with the TCB • Ctrl-Alt-Del in Windows

  37. TCSEC Assurance Requirements • Configuration Management • For TCB • Trusted Distribution • Integrity of mapping between master and installations • System Architecture • Small and modular • Design Specification – vary between classes • Verification – Vary between classes • Testing • Product Documentation

  38. TCSEC Classes • D – Minimal Protection • C1 – Discretionary Protection • Identification and authentication, and DAC • Users processing data at a common sensitivity level; separates users from data • Minimal assurance; may be based on features, not evaluation • C2 – Controlled Access Protection • Adds object reuse and auditing • More testing requirements • Windows NT 3.5 was evaluated at C2

  39. TCSEC Classes • B1 – Labeled Security Protection • Adds MAC for some objects • Controlled objects are “labeled”; access control is based on these labels • Stronger testing requirements. Informal model of security policy (Bell-LaPadula model). • Trusted Unixes tended to be B1 • B2 – Structured Protection • Design and implementation must enable thorough testing & review • “Well-defined largely independent modules” • MAC for all objects, including devices. Additional logging. Trusted Path. Least privilege. • Covert channel analysis, configuration management, more documentation, formal model of security policy
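The Bell-LaPadula rules behind B1's labels reduce to two lattice comparisons: the simple security property (no read up) and the *-property (no write down). A minimal Python sketch with a hypothetical four-level hierarchy:

```python
# Hypothetical linear ordering of sensitivity labels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level, object_level):
    """Simple security property: no read up."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    """*-property: no write down (prevents leaking data to lower levels)."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```

A real system uses labels with both a level and a category set, so dominance is a partial order rather than this simple integer comparison.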

  40. TCSEC Classes • B3 – Security Domains • Requirements on code modularity, layering, simplicity • Argument (short of proof) that implementation meets design specifications • Tamper-proof implementation, “highly resistant to penetration” • More stringent testing and documentation • A1 – Verified Protection • Same functional requirements as B3 • Five criteria • Formal model of protection and proofs of consistency/adequacy • Formal specification of the protection system • Demonstration that the specification corresponds to the model of protection • “Proof” that the implementation is consistent with the specification • Formal analysis of covert channels • Existence proof: Honeywell’s SCOMP

  41. TCSEC Evaluation process • Originally controlled by government • No fee to vendor • May reject evaluation application if product not of interest to government • Later introduced fee-based evaluation labs • Evaluation phases • Design analysis – no source code access • Test analysis • Final review

  42. TCSEC Evaluation Issues • Focused on operating systems • Evaluating a specific configuration • E.g., Windows NT with no applications installed and no network • New patches and versions require re-certification • Ratings Maintenance Program introduced to ease re-certification • Incremental changes documented, re-evaluated • Long time for evaluation • Sometimes a product was obsolete before evaluation finished • Criteria Creep • B1 means something more in 1999 than it did in 1989

  43. Interim Efforts in the ’90s • Canadian Trusted Computer Product Evaluation Criteria (CTCPEC) • Information Technology Security Evaluation Criteria (ITSEC) – Western Europe • Commercial International Security Requirements (CISR) – AmEx and EDS • Federal Criteria – NSA and NIST

  44. FIPS 140 • Federal Information Processing Standards • Framework for evaluating cryptographic modules • Still in use • Addresses • Functionality • Assurance • Physical security • Level 1 - algorithm must be FIPS-approved; can run on a COTS device • Level 2 - physical security, role-based authentication, s/w crypto in multiprocessors • Level 3 - enhanced physical security • Level 4 - physical tamper detection/response • Level 3 and 4 devices may be used with a suitably evaluated OS

  45. Common Criteria – 1998 to today • Pulls together international evaluation efforts • Evaluations are recognized across countries • Three top-level documents • Common Criteria Documents • Describe functional and assurance requirements; define Evaluation Assurance Levels (EALs) • CC Evaluation Methodology (CEM) • More details on the evaluation. Complete through EAL5 (at least) • Evaluation Scheme • Nation-specific rules for how CC evaluations are performed in that country • Directed by NIST in the US

  46. CC Terminology • Target of Evaluation (TOE) • The product being evaluated • TOE Security Policy (TSP) • Rules that regulate how assets are managed, protected, and distributed in a product • TOE Security Functions (TSF) • Implementation of the TSP (all hardware, software, firmware relied upon for the correct enforcement of TSP) CC evaluates “protection profiles”, and products/systems against a pre-defined (or user-defined) Evaluation Assurance Level (EAL)

  47. Protection Profile (PP) • Profile that describes the security requirements for a class of products • Implementation-independent; targets products or systems for specific consumer needs • Stated in terms of threats, environmental issues and assumptions, and security objectives • List of PPs: http://www.commoncriteriaportal.org/pp.html • Replaces the fixed set of classes from TCSEC • ISSO created some initial profiles to match TCSEC classes • Controlled Access Protection Profile (CAPP) corresponds to C2 • Labeled Security Protection Profile (LSPP) corresponds to B1

  48. PP Format • A PP has 6 sections • Introduction: PP identification, overview (narrative summary) • Product or System Family Description: type and general IT features; context of use • Product or System Family Security Environment: assumptions about use and environment; threats requiring protection; organizational policies required • Security Objectives: two types. For the product/system: trace objectives to specified threats and policies. For the environment: traced to threats not countered by the product, or to assumptions about the product • IT Security Requirements: functional (drawn from the CC, or other); security assurance: based on an EAL • Rationale: two parts. Objectives: trace stated objectives to all assumptions, threats, and organizational policies. Requirements: show they are traceable to the objectives, and meet them

  49. Product Evaluation • Define a security target (ST) • Structured very much like a PP, except with more implementation specificity • May leverage an evaluated protection profile • Evaluated with respect to the ST

  50. CC Functional Requirements • Defined in a taxonomy • Top level 11 classes • E.g., FAU – Security audit and FDP – User Data Protection • Each class divided into families • E.g., FDP_ACC – Access control policy • Each family divided into components • E.g., FDP_ACC.2 – Complete access control • Each component contains requirements and dependencies on other requirements
