

  1. Computer Security 3e Dieter Gollmann www.wiley.com/college/gollmann

  2. Chapter 13:Security Evaluation

  3. Introduction • A company wants to protect its assets; it must decide what to do for security; this is an executive decision. • Decision makers are rarely security experts; they need advice on the functionality a security system should provide and advice on suitable products. • Advice can be a best practice recommendation for a business sector. • Tells decision makers in which direction to turn when selecting security services. • Advice can be a test report on security products. • Should report not only the findings of the tests but also the tests that were conducted.

  4. Agenda • Security Evaluation • History: Orange Book • Common Criteria • Evaluating Security Evaluation • Summary

  5. Security Evaluation – History • TCSEC (Orange Book) (1983): Criteria for the US defense sector, predefined evaluation classes linking functionality and assurance. • ITSEC (1991): European criteria separating functionality and assurance so that very specific targets of evaluation can be specified and commercial needs can be better addressed. • TCSEC and ITSEC no longer in use. • Replaced by Common Criteria (CC) (1998): http://www.commoncriteriaportal.org/, http://csrc.nist.gov/cc/.

  6. Assurance & Functionality • As a user of a security sensitive system you need assurance that you are getting adequate security. • You need answers to two questions. • Does the system provide adequate protection for meeting my concrete security requirements? • This question refers to the functionality of the system. • Have the security services been implemented properly so that I can rely on them? • This question refers to the assurance that the system is providing the expected service.

  7. What is your question? • Evaluation: analysis of the level of security provided by a product, given generic security requirements. • Generic for the type of product evaluated. • Result of potential interest for a larger number of customers. • Certification: analysis of a product with respect to the specific security requirements of an organisation. • Would this product be fit for purpose? • Accreditation: executive decision to go ahead and deploy specific products in an organisation. • Is our security system fit for purpose? • Result relevant for this organisation. • Terms & interpretations from TCSEC (Orange Book).

  8. Security Evaluation • How do you get assurance that your computer systems are adequately secure? • You could trust your software providers. • You could check the software yourself, but you would have to be a real expert. • You could rely on a security evaluation by an independent body. • Security evaluation schemes first launched in the 1980s; today the Common Criteria are used internationally.

  9. Framework for Security Evaluation • What is the target of the evaluation? • What is the purpose of an evaluation? • What is the method of the evaluation? • What is the organizational framework for the evaluation process? • What is the structure of the evaluation criteria? • What are the costs and benefits of evaluation?

  10. Target & Purpose • Target of evaluation: • Product: “off-the-shelf” software component to be used in a variety of applications; has to meet generic security requirements. • System: collection of products assembled to meet the specific requirements of a given application. • Purpose of evaluation: • Evaluation: assess whether a product has the security properties claimed for it. • Certification: assess suitability of a product (system) for a given application. • Accreditation: decide to use a certain system.

  11. Method & Structure • Method of evaluation: an evaluation should not miss problems, and different evaluations of the same product should give the same result. • Product oriented: examine and test the product; better at finding problems. • Process oriented: check documentation and the product development process; better for repeatable results. • Repeatability and reproducibility are often desired properties of an evaluation methodology. • Structure of evaluation criteria: • Functionality: security features. • Effectiveness: are the mechanisms used appropriate? • Assurance: thoroughness of analysis.

  12. Framework, Costs, and Benefits • Organisational framework of evaluation: • Public service: evaluation by a government agency can be slow; it may be difficult to retain qualified staff. • Private service: how to make sure that customer pressure does not influence evaluation results? • Interpretation drift: the meaning of criteria may change over time and differ between evaluators. • Costs and benefits of evaluation: • Costs: fees paid for evaluation, indirect costs such as employee time, impact on the development process. • Benefits: may be required, e.g. for government contracts; marketing argument; better security?

  13. TCSEC – Orange Book • Developed for the national security sector, but intended to be more generally applicable; provides • a yardstick for users to assess the degree of trust that can be placed in a computer security system, • guidance for manufacturers of computer security systems, • a basis for specifying security requirements when acquiring a computer security system. • Security evaluation of the Trusted Computing Base (TCB) assumes that there is a reference monitor (see the sketch below). • Developed for systems enforcing multi-level security. • High assurance linked to formal methods, simple TCBs, and structured design methodologies; complex systems tend to fall into the lower evaluation classes.
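
The sketch referred to above is a minimal Python illustration of the reference monitor concept; the class, function, and data names are hypothetical and not taken from the book or from any evaluated system. It shows a single mediation point through which every access request passes, with denial as the default.

    # Sketch of a reference monitor: one check function mediates every
    # access request; anything not explicitly granted is denied.
    class ReferenceMonitor:
        def __init__(self, acl):
            # acl maps (subject, object) pairs to the set of permitted operations
            self.acl = acl

        def check(self, subject, obj, operation):
            return operation in self.acl.get((subject, obj), set())

    monitor = ReferenceMonitor({("alice", "report.txt"): {"read"}})
    print(monitor.check("alice", "report.txt", "read"))   # True
    print(monitor.check("alice", "report.txt", "write"))  # False (default deny)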

  14. TCSEC – Security Classes • Security classes defined incrementally; all requirements of one class automatically included in the requirements of all higher classes. • D: Minimal protection – no authentication, access control, … • C1: Discretionary security protection – support for discretionary access control, user identification/authentication, tamper-resistant kernel, security tested and documented (e.g., classic Unix versions) • C2: Controlled access protection – adds object reuse, audit trail of object access, access control lists with single user granularity (e.g., Unix with some auditing extensions, Windows 2K in a special configuration).

  15. TCSEC – Division B • B1: Labeled security protection • confidentiality labels for objects, mandatory access control policy, thorough security testing. • B2: Structured protection • trusted path from user to TCB, • formal security policy model, accurate high-level description, • well-structured TCB and user interface, • bandwidth estimation of covert storage channels, • system administration functions, • penetration testing. • B3: Security domains • security alarm mechanisms, • minimal TCB, • separation of system administrator & security administrator.
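
To make the label-based mandatory access control requirement of division B concrete, here is a small Python sketch of a "no read up" check; the label names and their ordering are assumptions chosen for illustration, not part of TCSEC.

    # Illustrative "no read up" check for label-based mandatory access control.
    LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

    def may_read(subject_clearance, object_label):
        # A subject may read an object only if its clearance dominates the object's label.
        return LEVELS[subject_clearance] >= LEVELS[object_label]

    print(may_read("secret", "confidential"))  # True: reading down is allowed
    print(may_read("confidential", "secret"))  # False: reading up is denied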

  16. TCSEC – Class A1 • A1: Verified design • formal model for security policy, • a Formal Top Level Specification (FTLS), • formal description of TCB must be proved to match the implementation, • strict protection of source code against unauthorised modification. • A1 rating for network components: MLS LAN (from Boeing) and Gemini Trusted Network Processor; SCOMP operating system.

  17. TCSEC – Evaluation • Complaint: TCSEC is too rigid. • If you want an assurance level higher than C2, you automatically get into multi-level security. • Reaction: ITSEC, where functionality and assurance could be defined separately. • Complaint: ITSEC is too flexible. • Vendors can make very specific security claims but their customers would find it difficult to compare different products.

  18. Rainbow Series • Orange Book is part of a collection of documents on security requirements, security management, and security evaluation published by NSA and NCSC (US National Security Agency and National Computer Security Center). • Documents in this series are known by the colour of their cover as the rainbow series. • Concepts from the Orange Book adapted to computer networks (Trusted Network Interpretation, Red Book), database management systems (Trusted Database Management System Interpretation, Lavender/Purple Book), etc.

  19. ITSEC • Information Technology Security Evaluation Criteria: harmonisation of the Dutch, British, French, and German national security evaluation criteria; endorsed by the Council of the European Union in 1995. • Builds on lessons learned from using TCSEC; intended as a framework for security evaluation that can deal with new security requirements. • Breaks the link between functionality and assurance. • Applies to security products and to security systems. • The sponsor of the evaluation determines the operational requirements and threats.

  20. ITSEC • Security objectives for the Target of Evaluation (TOE) further depend on laws and regulations; establish the required security functionality and evaluation level. • Security target specifies all aspects of TOE that are relevant for evaluation: security functionality of TOE, envisaged threats, objectives, and details of security mechanisms to be used. • Security functions of TOE may be specified individually or by reference to a predefined functionality class. • Seven evaluation levels E0 to E6 express the level of confidence in the correctness of the implementation of security functions.

  21. US Federal Criteria • Evaluation of products, linkage between function and assurance in the definition of evaluation classes. • Protection profiles to overcome the rigid structure of the Orange Book; five sections of a protection profile: • Descriptive Elements: ‘name’ of protection profile, description of the problem to be solved. • Rationale: justification of the protection profile, including threat, environment, and usage assumptions, some guidance on the security policies that can be supported. • Functional Requirements: protection boundary that must be provided by the product. • Development Assurance Requirements. • Evaluation Assurance Requirements: type and intensity of the evaluation.

  22. Common Criteria • Criteria for the security evaluation of products or systems, called the Target of Evaluation (TOE). • Protection Profile (PP): a (re-usable) set of security requirements, including an EAL; should be developed by user communities to capture typical protection requirements. • Security Target (ST): expresses security requirements for a specific TOE, e.g. by reference to a PP; basis for any evaluation. • Evaluation Assurance Level (EAL): define what has to be done in an evaluation; there are seven hierarchically ordered EALs.

  23. CC Protection Profile • PP introduction – PP identification – PP overview • TOE description • TOE security environment – Assumptions – Threats – Organisational security policies • Security objectives – Security objectives for the TOE – Security objectives for the environment • IT security requirements – TOE security requirements (TOE security functional requirements, TOE security assurance requirements) – Security requirements for the IT environment • PP application notes • Rationale – Security objectives rationale – Security requirements rationale

  24. Protection Profiles – Examples • Firewall with strict requirements • Smart Card Security User Group - Smart Card Protection Profile (SCSUG-SCPP) • Smartcard Integrated Circuit Protection Profile • Smartcard embedded software • Smart Card IC with Multi-Application Secure Platform • Contact and Contact free Electronic Wallet • Automatic Cash Dispensers / Teller • Postage Meter • Single-Level Operating Systems in Medium Robustness Environments • Multi-Level Operating Systems in Medium Robustness Environments • Public Key Infrastructure and Key Management Infrastructure Token • Secure Signature-Creation Device Type 1 (and Type 2, Type 3) • Trusted Computing Platform Alliance Trusted Platform Module • Oracle Government Database Management System

  25. CC Assurance Levels • EAL1 - functionally tested • EAL2 - structurally tested • EAL3 - methodically tested and checked • EAL4 - methodically designed, tested, and reviewed • EAL5 - semiformally designed and tested • EAL6 - semiformally verified design and tested • EAL7 - formally verified design and tested

  26. CC Assurance Levels • EAL1 – Tester receives target of evaluation, reads documentation and performs some tests to confirm documented functionality; outlay for evaluation should be minimal. • EAL2 – Developer provides in addition test documentation and vulnerability analysis for review by evaluator, who will repeat some of these tests; effort required from the developer is small. • EAL3 – Developer uses configuration management, documents security arrangements for development, provides high-level design documentation and documentation on test coverage for review. • For developers who already follow good development practices but do not want to further change their practices.

  27. CC Assurance Levels • EAL4 – Developer provides low-level design and subset of security functions (TCB) source code for evaluation, secure delivery procedures, evaluator performs independent vulnerability analysis. • Usually EAL4 is the highest level that is economically feasible for an existing product line. • EAL5 – Developer provides formal model of security policy, semiformal high-level design and functional specification as well as full source code of security functions, covert channel analysis, independent penetration testing. • TOE designed and developed with the intent of achieving EAL5; additional evaluation costs ought not to be large.

  28. CC Assurance Levels • EAL6 – Source code must be well structured, access control implementation must have low complexity (reference monitor), intensive penetration testing; cost of evaluation expected to increase. • EAL7 – Developer provides formal functional specification and high-level design, demonstrates or proves correspondence between all representations of security functions, security functions must be simple enough for formal analysis. • EAL7 typically only achieved with a TOE that has a tightly focused security functionality and is amenable to extensive formal analysis.

  29. CC Evaluated Operating Systems • EAL4: Sun Solaris (TM) 8 Operating Environment • EAL4: HP-UX (11i) Version 11.11 • EAL4+: AIX 5L for POWER V5.2 Program Number 5765-E62 • EAL3: SGI Trusted IRIX/CMW Version 6.5.13 • EAL4+: Windows 2000 Professional, Server, and Advanced Server with SP3 and Q326886 Hotfix • EAL4: B1/EST-X Version 2.0.1 with AIX, Version 4.3.1 • EAL4: Sun Trusted Solaris Version 8 4/01 • EAL3: SGI IRIX/CMW Version 6.5.13

  30. Mutual Recognition • Common Evaluation Methodology (CEM): set of steps for validating the assurance requirements in a Security Target. • Common Criteria Recognition Agreement (CCRA): recognition of evaluations performed in another country. • CEM addresses assurance levels EAL1 to EAL4; these are the only assurance levels that are mutually recognized. • Higher assurance levels are automatically accepted only within a single country.

  31. Quality Standards • Ultimate step towards audit-based evaluation: assess how a product is developed but not the product itself. • A company would become a ‘certified producer of secure systems’. • This approach is popular in the area of quality control: organisations follow the ISO 9000 standard on internal quality management and external quality assurance to vouch for the quality of their products. • Some vendors claim that being registered under an ISO 9000 quality seal is a better selling argument than a security certificate for a particular product and that security evaluation should move in this direction.

  32. Quality Standards • Such a proposal is attractive for companies developing secure systems: the costs of evaluation are much reduced. • If the developers of secure systems are the winners under this proposal, will the users of secure systems lose out? • This is not a foregone conclusion; a certificate is no guarantee that a system cannot be broken. • You have to assess each evaluation scheme on its own merits to decide whether individually evaluated products offer more security than products from accredited developers.

  33. Evaluating Security Evaluation • Success story: smart cards. • Industry-led design of a protection profile, high assurance evaluation has revealed flaws in products. • Industry comment: Investment in evaluation has been worth the effort. • Contributing factors: • security a primary application; • relatively fixed functionality (evaluation results not immediately out of date); • market demand; • high assurance evaluation detecting flaws.

  34. Evaluating Security Evaluation • Not a success story: operating systems. • Complex and evolving functionality; code base from different sources; high assurance evaluation hardly feasible; lower assurance evaluation hardly finds flaws. • How many security advisories on Windows/Linux have come out of Common Criteria evaluations? • Industry comment (user): quality assurance of vendors more telling than evaluation of their products. • Contributing factors: • security competes with other criteria; • moving target, evaluations almost by default out of date; • interaction with other software components (e.g. browsers); • wide range of user requirements.

  35. Evaluating Security Evaluation • TCSEC designed for evaluating operating systems, i.e. an infrastructure component. • Networking was an optional feature in the 1980s; not covered by TCSEC. • Infrastructure was managed by professionals. • Users were just “users”. • Security is moving to the application layer: • The browser is the new infrastructure for the web. • There is no protection profile for web browsers! • Vulnerable application software can be exploited even when it runs on secure platforms.

  36. Evaluating Security – The Future? • Target is moving to the application layer. • In terms of IT: SQL injection, XSS, XSRF, JavaScript Hijacking all exploit flaws in application software. • In terms of organisation: how to manage security (components). • Applications are heavily customized. • We need a rapid and adaptable methodology for evaluating application software. • When moving to the application layer, security is moving closer to the end user. • Security evaluation has to consider unsophisticated end users when assessing usability.
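
To ground the application-layer flaw classes named above, the following Python sketch (using the standard sqlite3 module; the table, column, and variable names are made up for illustration) contrasts a query built by string concatenation, which is open to SQL injection, with a parameterized query that treats the input purely as data.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

    user_input = "' OR '1'='1"

    # Vulnerable: the input becomes part of the SQL statement itself.
    unsafe = "SELECT role FROM users WHERE name = '" + user_input + "'"
    print(conn.execute(unsafe).fetchall())                # returns the admin row

    # Mitigated: a parameterized query keeps the input out of the SQL syntax.
    safe = "SELECT role FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())   # returns []

XSS and XSRF are analogous in that untrusted input crosses into a context where it is interpreted as code or as an authorised request; the point of the sketch is only to show why such flaws live in the application software rather than in the underlying platform.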

  37. Security Evaluation – Summary • Security evaluation has been required in some countries by public sector customers. • Major O/S and DBMS vendors offer evaluated products. • Outside the government sector there has been little enthusiasm for evaluated products. • Current exception: smart card software. • Persistent problem: products keep evolving, so an evaluation often refers to a version that is no longer in use.
