
CSCE 548 Secure System Standards Risk Management



  1. CSCE 548 Secure System Standards Risk Management

  2. Announcement • Job Openings: • Daniel Rusu, Dreamgol, LLC • drusu9@gmail.com • 803-727-5634 • www.dreamgol.com

  3. Announcements • Job openings: • Peter J. Johnson, Staffing Consultant • 978.927.7000 (m) / premagni@verizon.net (e) • www.linkedin.com/in/peterjjohnson (LinkedIn) • Positions in Massachusetts, Maryland, Virginia, and Ohio for computer scientists and software engineers with credentials in security, networking, and privacy • US citizenship required

  4. Announcement • Job Openings: • Charleston, system development • C, Python, and Java; Linux (specifically Fedora 14); cellular communications; electrical engineering fundamentals; and MySQL • Android and iPhone/iPad developers to build follow-on versions of a current geo-location/SMS application • Contact info upon request

  5. Project • Requirements available at http://www.cse.sc.edu/~farkas/csce548-2012/csce548-project-requirements.htm • Useful links: • OWASP, Open Web Application Security Project, https://www.owasp.org/index.php/Main_Page • Sample projects

  6. Homework 1 • Choose a team member from among your classmates. This selection is for this exercise only. • List the steps of the RMF for "KillerAppCo's iWare 1.0 Server" as given in your textbook. (3 points) • Carry out a similar RMF on the computing resources owned by your team member. For example, understanding the "business" context may include goals like graduating from USC, making a profit from writing software for a company, etc. Document your RMF activities and findings. (7 points) • BONUS (2 points): Have your partner evaluate your risk management report and comment on it.

  7. Reading • This lecture: • McGraw: Chapter 2 • Recommended: • Rainbow Series Library, http://www.fas.org/irp/nsa/rainbow.htm • Common Criteria, http://www.commoncriteriaportal.org/ • Next lecture: • Software Development Lifecycle – Dr. J. Vidal

  8. Risk Assessment [diagram: RISK arising from the combination of threats, vulnerabilities, and consequences]

  9. Financial Loss [chart: dollar amount losses by type; total loss (2006): $53,494,290] Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute

  10. Security Protection [charts: percentage of IT budget spent on security; percentage of organizations using ROI, NPV, or IRR metrics] Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute

  11. Real Cost of Cyber Attack • Damage to the attacked target alone may not reflect the full extent of the loss • Other services may rely on the attacked service, causing cascading and escalating damage • Need: support for decision makers to • Evaluate risk and consequences of cyber attacks • Support methods to prevent, deter, and mitigate consequences of attacks

  12. System Security Engineering (Traditional View) [flow: Specify System Architecture → Identify Threats, Vulnerabilities, Attacks → Estimate Risk → Prioritize Vulnerabilities → Identify and Install Safeguards; repeat until risk is acceptably low]

  13. Risk Management Framework (Business Context) [flow: Understand Business Context → Identify Business and Technical Risks → Synthesize and Rank Risks → Define Risk Mitigation Strategy → Carry Out Fixes and Validate; Measurement and Reporting runs across all stages]
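One way to picture the framework: the five stages run in sequence (and iterate), while measurement and reporting cuts across all of them. A minimal Python sketch, assuming nothing beyond the stage names on the slide:

```python
# The five RMF stages from the slide, applied in order; measurement and
# reporting is a cross-cutting activity that runs during every stage.
RMF_STAGES = [
    "Understand the business context",
    "Identify business and technical risks",
    "Synthesize and rank the risks",
    "Define the risk mitigation strategy",
    "Carry out fixes and validate",
]

def run_rmf(measure_and_report):
    """Walk the RMF stages, invoking the cross-cutting reporting hook."""
    for stage in RMF_STAGES:
        print(f"Stage: {stage}")
        measure_and_report(stage)

run_rmf(lambda stage: print(f"  (recording risk metrics during: {stage})"))
```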

  14. Understand the Business Context • “Who cares?” • Identify business goals, priorities, and circumstances, e.g., • Increasing revenue • Meeting service-level agreements • Reducing development cost • Generating a high return on investment • Identify which software risks to consider

  15. Identify Business and Technical Risks • “Why should business care?” • Business risk • Direct threat • Indirect threat • Consequences • Financial loss • Loss of reputation • Violation of customer or regulatory constraints • Liability • Tying technical risks to the business context in a meaningful way

  16. Synthesize and Rank the Risks • “What should be done first?” • Prioritization of identified risks based on business goals • Allocating resources • Risk metrics: • Risk likelihood • Risk impact • Risk severity • Number of emerging risks
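As a rough illustration of such prioritization (the risk names, scores, and the likelihood-times-impact metric below are illustrative assumptions, not from the lecture):

```python
# Hypothetical risk register; likelihood and impact scored 1-5.
risks = [
    {"name": "SQL injection in customer portal", "likelihood": 4, "impact": 5},
    {"name": "Unpatched web server",             "likelihood": 3, "impact": 4},
    {"name": "Weak password policy",             "likelihood": 5, "impact": 3},
]

def severity(risk):
    """Severity as likelihood x impact -- one common, simple risk metric."""
    return risk["likelihood"] * risk["impact"]

# Rank highest-severity risks first so mitigation resources go to them first.
for r in sorted(risks, key=severity, reverse=True):
    print(f"{severity(r):>2}  {r['name']}")
```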

  17. Define the Risk Mitigation Strategy • “How to mitigate risks?” • Available technology and resources • Constrained by the business context: what can the organization afford, integrate, and understand • Need validation techniques

  18. Carry Out Fixes and Validate • Perform actions defined in the previous stage • Measure “completeness” against the risk mitigation strategy • Progress against risk • Remaining risks • Assurance of mechanisms • Testing

  19. Measuring and Reporting • Continuous and consistent identification and storage of risk information over time • Maintain risk information at all stages of risk management • Establish measurements, e.g., • Number of risks, severity of risks, cost of mitigation, etc.

  20. Assets-Threat Model (1) • Threats compromise assets • Threats have a probability of occurrence and severity of effect • Assets have values • Assets are vulnerable to threats [diagram: threats act on assets]

  21. Assets-Threat Model (2) • Risk: expected loss from the threat against an asset • R = V * P * S • R: risk • V: value of the asset • P: probability of occurrence of the threat • S: severity of effect, i.e., the vulnerability of the asset to the threat
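A minimal sketch of this expected-loss calculation in Python; the asset value and probabilities below are made-up numbers for illustration:

```python
def expected_loss(value, probability, severity):
    """Risk R = V * P * S: expected loss from a threat against an asset."""
    return value * probability * severity

# Illustrative: a $100,000 asset, a threat with a 10% chance of occurring,
# and a severity factor of 0.5 (the threat destroys half the asset's value).
risk = expected_loss(value=100_000, probability=0.10, severity=0.5)
print(f"Expected loss: ${risk:,.2f}")  # Expected loss: $5,000.00
```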

  22. System-Failure Model • Estimate probability of highly undesirable events • Risk: likelihood of undesirable outcome [diagram: a threat acts on the system, producing an undesirable outcome]

  23. Risk Acceptance • Certification • How well the system meets the security requirements (technical) • Accreditation • Management’s approval of an automated system (administrative)

  24. The following slides are recommended reading only

  25. Incident Handling Computer Security Incident Handling Guide, Recommendations of the National Institute of Standards and Technology http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf

  26. How to Respond? • Actions to avoid further loss from intrusion • Terminate intrusion and protect against reoccurrence • Law enforcement – prosecute • Enhance defensive security • Reconstructive methods based on: • Time period of intrusion • Changes made by legitimate users during the affected period • Regular backups, audit-trail-based detection of affected components, semantic-based recovery, minimal roll-back for recovery

  27. Roles and Responsibilities • User: • Vigilant for unusual behavior • Report incidents • Manager: • Awareness training • Policies and procedures • System administration: • Install safeguards • Monitor system • Respond to incidents, including preservation of evidence

  28. Computer Incident Response Team • Assist in handling security incidents • Formal • Informal • Incident reporting and dissemination of incident information • Computer Security Officer • Coordinate computer security efforts • Others: law enforcement coordinator, investigative support, media relations, etc.

  29. Incident Response Process 1. Preparation • Baseline Protection • Planning and guidance • Roles and Responsibilities – Training • Incident response team

  30. Incident Response Process 2. Identification and assessment • Symptoms • Nature of incident • Identify perpetrator, origin, and extent of attack • Can be done during or after the attack • Gather evidence • Keystroke monitoring, honeynets, system logs, network traffic, etc. • Mind the legislation on monitoring! • Report on preliminary findings

  31. Incident Response Process 3. Containment • Reduce the chance of spread of incident • Determine sensitive data • Terminate suspicious connections, personnel, applications, etc. • Move critical computing services • Handle human aspects, e.g., perception management, panic, etc.

  32. Incident Response Process 4. Eradication • Determine and remove cause of incident if economically feasible • Improve defenses, software, hardware, middleware, physical security, etc. • Increase awareness and training • Perform vulnerability analysis

  33. Incident Response Process 5. Recovery • Determine course of action • Reestablish system functionality • Reporting and notifications • Documentation of incident handling and evidence preservation

  34. Follow-Up Procedures • Incident evaluation: • Quality of incident handling (preparation, time to respond, tools used, evaluation of response, etc.) • Cost of incident (monetary cost, disruption, lost data, hardware damage, etc.) • Preparing report • Revise policies and procedures

  35. Security Awareness and Training • Major weakness: user unawareness • Organizational effort • Educational effort • Customer training • Federal Trade Commission: program to educate customers about web scams

  36. Building It Secure • 1960s: US Department of Defense (DoD) recognized the risk of unsecured information systems • 1970s: • 1977: DoD Computer Security Initiative • US Government and private concerns • National Bureau of Standards (NBS – now NIST) • Responsible for standards for acquisition and use of federal computing systems • Federal Information Processing Standards (FIPS PUBs)

  37. NBS • Two initiatives for security: • Cryptography standards • 1973: invitation for technical proposals for ciphers • 1977: Data Encryption Standard • 2001: Advanced Encryption Standard (NIST) • Development and evaluation processes for secure systems • Conferences and workshops • Involving researchers, system builders, vendors, software developers, and users • 1979: the Mitre Corporation was entrusted with producing an initial set of criteria for evaluating the security of systems handling classified data

  38. National Computer Security Center • 1981: National Computer Security Center (NCSC) was established within NSA • To provide technical support and reference for government agencies • To define a set of criteria for the evaluation and assessment of security • To encourage and perform research in the field of security • To develop verification and testing tools • To increase security awareness in both federal and private sector • 1985: Trusted Computer System Evaluation Criteria (TCSEC) == Orange Book

  39. Orange Book • Orange Book objectives: • Guidance on what security features to build into new products • Provide a measurement to evaluate the security of systems • Basis for specifying security requirements • Security features and assurances • Trusted Computing Base (TCB): the security components of the system – hardware, software, and firmware – plus the reference monitor

  40. What the Orange Book Supplies • Users: evaluation metrics to assess how reliably a security system protects classified or sensitive information, whether a • Commercial product • Internally developed system • Developers/vendors: a design guide showing the security features to include in commercial systems • Designers: a guide for the specification of security requirements

  41. Orange Book • Set of criteria and requirements • Three main categories: • Security policy – the protection level offered by the system • Accountability – of the users and user operations • Assurance – of the reliability of the system

  42. Security Policy • Concerns the definition of the policy regulating the access of users to information • Discretionary Access Control • Mandatory Access Control • Labels: for objects and subjects • Reuse of objects: basic storage elements must be cleaned before being released to a new user
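The object-reuse requirement can be sketched as follows (illustrative Python, not any particular system's API): storage is scrubbed before it is handed to a new subject.

```python
def release_buffer(buf: bytearray) -> bytearray:
    """Scrub a basic storage element before reallocation, so no residual
    data from the previous owner leaks to the next user (object reuse)."""
    for i in range(len(buf)):
        buf[i] = 0
    return buf

buf = bytearray(b"previous user's secret data")
clean = release_buffer(buf)
assert all(byte == 0 for byte in clean)  # no residue remains
```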

  43. Accountability • Identification/authentication • Audit • Trusted path: ensures no user is attempting to access the system fraudulently

  44. Assurance • Reliable hardware/software/firmware components that can be evaluated separately • Operation reliability • Development reliability

  45. Operation reliability • During system operation • System architecture: TCB isolated from user processes, security kernel isolated from non-security critical portions of the TCB • System integrity: correct operation (use diagnostic software) • Covert channel analysis • Trusted facility management: separation of duties • Trusted recovery: recover security features after TCB failures

  46. Development reliability • System must be reliable during the development process; formal methods • System testing: security features tested and verified • Design specification and verification: correct design and implementation with respect to the security policy; TCB formal specifications are proved • Configuration management: configuration of the system components and its documentation • Trusted distribution: no unauthorized modifications

  47. Documentation • Defined set of documents • Minimal set: • Trusted facility manual • Security features user’s guide • Test documentation • Design documentation • Personnel info: Operators, Users, Developers, Maintainers

  48. Orange Book Levels Highest Security • A1 Verified protection • B3 Security Domains • B2 Structured Protection • B1 Labeled Security Protections • C2 Controlled Access Protection • C1 Discretionary Security Protection • D Minimal Protection No Security
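Because the divisions form a strict ordering, a rating can be compared against a requirement directly. A small sketch (the ordering comes from the slide; the code itself is illustrative):

```python
from enum import IntEnum

class TCSECLevel(IntEnum):
    """Orange Book evaluation classes, from no security (D) up to A1."""
    D  = 0  # Minimal Protection
    C1 = 1  # Discretionary Security Protection
    C2 = 2  # Controlled Access Protection
    B1 = 3  # Labeled Security Protection
    B2 = 4  # Structured Protection
    B3 = 5  # Security Domains
    A1 = 6  # Verified Protection

# A system rated B2 satisfies a requirement of at least C2:
assert TCSECLevel.B2 >= TCSECLevel.C2
print(sorted(TCSECLevel, reverse=True))  # highest security first
```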

  49. NCSC Rainbow Series • Orange: Trusted Computer System Evaluation Criteria • Yellow: Guidance for applying the Orange Book • Red: Trusted Network Interpretation • Lavender: Trusted Database Interpretation

  50. Evaluation Process • Preliminary technical review (PTR) • Preliminary technical report: architecture, potential for target rating • Vendor assistance phase (VAP) • Review of the documentation needed for the evaluation process, e.g., security features user’s guide, trusted facility manual, design documentation, test plan. For B or higher, additional documentation is needed, e.g., covert channel analysis, formal model, etc. • Design analysis phase (DAP) • Initial product assessment report (IPAR): 100-200 pages; detailed info about the hardware and software architecture, security-relevant features, team assessments, etc. • Technical Review Board • Recommendation to the NCSC
