

  1. CSC 482/582: Computer Security Software Security

  2. Topics • The problem of software security • System security standards • Secure development lifecycle • Simple Web Server • Economics of Security

  3. Traditional Security is Reactive • Perimeter defense (firewalls) • Intrusion detection • Over-reliance on cryptography • Penetrate and patch • Penetration testing

  4. The Problem is Software “75% of hacks happen at the application.” - Theresa Lanowitz, Gartner Inc. “92% of reported vulnerabilities are in apps, not networks.” - NIST “64% of developers are not confident in their ability to write secure code.” - Bill Gates

  5. Why is most software insecure? • Security is seen as something that gets in the way of software functionality. • Security is difficult to assess and quantify. • Security is often not a primary skill or interest of software developers. • Time spent on security is time not spent on adding new and interesting functionality. • Characteristics of modern software: the trinity of trouble.

  6. Trinity of Trouble Connectivity • Ubiquitous Internet; wireless & mobile computing. Complexity • Networked, distributed code that can interact with intermediate caches, ad proxies, etc. Extensibility • Systems evolve in unexpected ways, e.g. web browsers, which support many formats, add-ons, plugins, programming languages, etc.

  7. Software Security Objectives 1. Dependability: software functions only as intended; 2. Trustworthiness: no exploitable vulnerabilities or malicious logic exist in the software; 3. Resilience: if compromised, damage will be minimized, and the software will recover quickly to an acceptable level of operating capacity; 4. Conformance: the software conforms to requirements and applicable standards and procedures.

  8. System Security Certifications Software system security certifications • Orange Book • ISO 15408 Common Criteria • PCI Data Security Standard (PCI DSS) Many standards indirectly impact SSE • FISMA (supersedes CSA of 1987) • HIPAA (health information privacy) • Gramm–Leach–Bliley Act (GLB) • SOX (Sarbanes-Oxley)

  9. Orange Book Trusted Computer System Evaluation Criteria (TCSEC) • Issued in 1983 by the NSA. • Replaced by the Common Criteria in 2005. System classification levels D: failed evaluation for a higher classification C: discretionary protection • Authentication, DAC, basic auditing B: mandatory protection • MAC, security policy requirements, auditing including covert channels A: verified protection • Formal specification and design techniques with proofs

  10. Common Criteria ISO standard 15408 • International standard, used by the US and others. Protection Profile (PP) • Implementation-independent statement of security requirements for a class of products, e.g. anti-virus, firewall, smartcard. Evaluation Assurance Level (EAL) • Numeric rating, from EAL1 (lowest) to EAL7 (highest), indicating the depth and rigor of the evaluation.

  11. PCI DSS PCI Data Security Standard (PCI DSS) • Proprietary security standard for organizations that handle card payment information for the major credit and debit card companies. Requirements • Network security • Cardholder data security • Access control and auditing • Security policy • Vulnerability management • Developing and maintaining secure systems and applications

  12. Secure Development Processes • CLASP (Comprehensive, Lightweight Application Security Process) • Correctness-by-Construction (formal methods based process from Praxis Critical Systems) • MS SDL (Microsoft Secure Development Lifecycle) • SSE CMM (Secure Software Engineering Capability Maturity Model) • TSP-Secure (Team Software Process for Secure Software Development) • Touchpoints

  13. Security Development Lifecycle Security touchpoints mapped onto the lifecycle phases: • Requirements: Abuse Cases • Design: Risk Analysis • Coding: Code Reviews + Static Analysis • Testing: Security Testing, Penetration Testing • Maintenance: Security Operations

  14. Code Reviews A code review is an examination of source code by developers other than the author to find defects. Benefits • Find defects sooner in the lifecycle. • Find defects with less effort than testing. • Find different defects than testing. • Educate developers about vulnerabilities. Static analysis tools can assist in finding security bugs.
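
  To make the kind of defect reviewers and static analysis tools look for concrete, here is a minimal Python sketch, not from the course materials; the users table and function names are hypothetical. The first function builds SQL from untrusted input (the classic injection finding); the second shows the usual remediation.

      import sqlite3

      def find_user_unsafe(conn, username):
          # What a reviewer or static analyzer flags: untrusted input concatenated
          # into SQL, enabling injection (e.g., username = "x' OR '1'='1").
          query = "SELECT id, email FROM users WHERE name = '" + username + "'"
          return conn.execute(query).fetchall()

      def find_user_safe(conn, username):
          # Remediation: a parameterized query keeps data out of the SQL grammar.
          return conn.execute(
              "SELECT id, email FROM users WHERE name = ?", (username,)
          ).fetchall()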

  15. Security in Design • Apply secure design principles throughout design process, such as • Least Privilege • Fail-Safe Defaults • Defense in Depth • Separation of Privilege • Use secure design patterns where applicable. • Perform an architectural risk analysis to evaluate the security of your design and to identify design changes that need to be made to improve security.
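
  As a rough illustration of two of these principles, fail-safe defaults and least privilege, here is a small Python sketch; the roles and permissions are assumptions made up for the example.

      # Fail-safe defaults: deny unless a permission is explicitly granted.
      # Least privilege: each role gets only the permissions it needs.
      PERMISSIONS = {
          "viewer": {"read"},
          "editor": {"read", "write"},
          "admin":  {"read", "write", "delete"},
      }

      def is_allowed(role, action):
          # Unknown roles or actions fall through to "deny", not "allow".
          return action in PERMISSIONS.get(role, set())

      assert is_allowed("editor", "write")
      assert not is_allowed("guest", "read")   # unknown role is denied by default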

  16. Designing-In Security Design features with security in mind • Not as an afterthought • Hard to add on security later Define concrete, measurable security goals. Ex: • Only certain users should be able to do X. Log the action. • Output of feature Y should be encrypted. • Feature Z should be available 99.9% of the time. Bad examples: Windows 98, the Internet
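
  The first example goal ("only certain users should be able to do X, log the action") can be made testable with a sketch like the following; the user dictionary, role names, and logger setup are illustrative assumptions.

      import logging

      logging.basicConfig(level=logging.INFO)
      audit_log = logging.getLogger("audit")

      def delete_record(user, record_id, allowed_roles=("admin",)):
          # Measurable goal: only allowed roles may delete, and every attempt is logged.
          if user.get("role") not in allowed_roles:
              audit_log.warning("DENY delete record=%s user=%s", record_id, user.get("name"))
              raise PermissionError("not authorized")
          audit_log.info("ALLOW delete record=%s user=%s", record_id, user.get("name"))
          # ... perform the deletion here ...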

  17. The Internet All nodes were originally university or military (i.e. trusted), since the network grew out of DARPA. • The Internet's security architecture was designed for a trusted environment. With commercialization came lots of new hosts, all allowed to connect to existing hosts regardless of whether they were trusted. • Attempted to backport security with firewalls. • Firewalls let in only traffic from trusted IPs to trusted ports. • But they can be bypassed by spoofed IPs, tunneling via allowed ports, reverse shells, etc.

  18. Architectural Risk Analysis Fix design flaws, not implementation bugs. Risk analysis steps • Develop an architecture model. • Identify threats and possible vulnerabilities. • Develop attack scenarios. • Rank risks based on probability and impact. • Develop mitigation strategy. • Report findings.
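
  The ranking step is often just expected impact = probability × impact on a small scale; a tiny Python sketch follows, with the risks and their scores invented purely for illustration.

      # Rank risks by probability x impact (each estimated on a 1-5 scale here).
      risks = [
          {"threat": "SQL injection in search form",  "probability": 4, "impact": 5},
          {"threat": "Session fixation",              "probability": 2, "impact": 3},
          {"threat": "DoS via unbounded file upload", "probability": 3, "impact": 4},
      ]

      for r in sorted(risks, key=lambda r: r["probability"] * r["impact"], reverse=True):
          print(r["probability"] * r["impact"], r["threat"])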

  19. Black Box Testing Advantages of Black Box Testing • Examines system as an outsider would. • Tester builds understanding of attack surface and system internals during test process. • Can use to evaluate effort required to attack system. • Helps test items that aren't documented. [Diagram: test input → system (black box) → test output]
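
  A black-box tester drives only the externally visible interface. The sketch below probes a hypothetical login endpoint at http://localhost:8080/login; the URL and parameter names are assumptions, not part of the slides.

      import urllib.error, urllib.parse, urllib.request

      def try_login(username, password, url="http://localhost:8080/login"):
          # Black box: we only see the HTTP interface, not the source code.
          data = urllib.parse.urlencode({"user": username, "pass": password}).encode()
          try:
              with urllib.request.urlopen(url, data=data, timeout=5) as resp:
                  return resp.status
          except urllib.error.HTTPError as e:
              return e.code
          except urllib.error.URLError:
              return None   # server unreachable

      # Observe behavior from the outside: do odd inputs produce different responses?
      print(try_login("alice", "correct-horse"))
      print(try_login("alice", "' OR '1'='1"))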

  20. White and Grey Box Testing White Box • Tester knows all information about system. • Including source code, design, requirements. • Most efficient technique. • Avoids security through obscurity. Grey Box • Apply both white box and black box techniques.

  21. Penetration Testing Black box test of deployed system. Allocate time at end of development to test. • Often time-boxed: test for n days. • Schedule slips often reduce testing time. • Fixing flaws is expensive late in lifecycle. Penetration testing tools • Test common vulnerability types against inputs. • Fuzzing: send random data to inputs. • Don’t understand application structure or purpose.
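
  In the spirit of the fuzzing bullet, here is a minimal random-input sketch in Python; the parse_record target is a made-up stand-in for whatever input handler is under test, and real fuzzers (AFL, libFuzzer, etc.) are far more sophisticated.

      import random

      def parse_record(data: bytes):
          # Hypothetical target: assumes at least one byte is always present.
          length = data[0]              # IndexError on empty input
          return data[1:1 + length]

      random.seed(0)
      for _ in range(1000):
          blob = bytes(random.randrange(256) for _ in range(random.randrange(8)))
          try:
              parse_record(blob)
          except Exception as exc:      # fuzzers flag unexpected exceptions/crashes
              print(f"input {blob!r} crashed the parser: {exc!r}")
              break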

  22. Security Testing Security bugs such as injection flaws, buffer overflows, XSS, etc. live in the gap between intended functionality and actual functionality. Functional testing will find missing intended functionality; it will not find the unintended behavior that attackers exploit. [Diagram: overlapping circles of intended vs. actual functionality]

  23. Security Testing Two types of testing Functional: verify security mechanisms. Adversarial: verify resistance to attacks generated during risk analysis. Different from traditional penetration testing • White box. • Use risk analysis to build tests. • Measure security against risk model.
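
  A sketch of the "functional" half: verify that a security mechanism does what the requirement says. The salted PBKDF2 password check here is an illustrative mechanism, not one prescribed by the slides; the adversarial cases would be drawn from the risk analysis.

      import hashlib, hmac, os, unittest

      def hash_password(password, salt=None):
          # Mechanism under test (illustrative): salted PBKDF2 password storage.
          salt = salt or os.urandom(16)
          return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

      def verify_password(password, salt, digest):
          return hmac.compare_digest(hash_password(password, salt)[1], digest)

      class SecurityMechanismTests(unittest.TestCase):
          def test_correct_password_verifies(self):        # functional
              salt, digest = hash_password("s3cret")
              self.assertTrue(verify_password("s3cret", salt, digest))

          def test_wrong_or_empty_password_rejected(self): # adversarial-flavored
              salt, digest = hash_password("s3cret")
              self.assertFalse(verify_password("s3cret ", salt, digest))
              self.assertFalse(verify_password("", salt, digest))

      if __name__ == "__main__":
          unittest.main()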

  24. Security in Software Requirements Robust, consistent error handling is always needed. Share requirements with the QA team for security testing. Handle internal errors securely – don’t provide useful information to attackers via error messages! “Security or Bust” Policy: Microsoft delayed the ship of .NET Server in 2002 because security requirements were not met by the “code freeze” deadline.
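
  A small sketch of the error-handling point: log the detail internally, return only a generic message to the caller. The database lookup and message text are hypothetical.

      import logging, sqlite3

      log = logging.getLogger("app")

      def lookup_balance(conn, account_id):
          try:
              row = conn.execute(
                  "SELECT balance FROM accounts WHERE id = ?", (account_id,)
              ).fetchone()
              return row[0] if row else None
          except sqlite3.Error:
              # Full details (query, stack trace) go to the internal log only.
              log.exception("account lookup failed for id=%r", account_id)
              # The caller, and any attacker, sees only a generic message.
              raise RuntimeError("internal error, please try again later") from None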

  25. Measurable Security Requirements Access Control Security: Only certain users can perform specified functions Auditing: Maintain log of users’ sensitive actions Confidentiality: encrypt certain functions’ output Availability: Certain features should be available almost always (99.99%) or within specified time Include these requirements in design docs!

  26. Abuse Cases Anti-requirements Think about what software should not do. A use case from an adversary’s point of view. • Obtain Another User’s CC Data. • Alter Item Price. • Deny Service to Application. Developing abuse cases Informed brainstorming: attack patterns, risks.

  27. Security Operations User security notes • Software should be secure by default. • Enabling certain features may have risks. • User needs to be informed of security risks. Incident response • What happens when a vulnerability is reported? • How do you communicate with users? • How do you send updates to users?

  28. Economics of Security How much does it cost to break a system? • All systems have vulnerabilities. • How much time, expertise, & resources are needed to exploit them? If value of assets > cost to break system, then • Attackers are motivated to conduct a successful attack. • Defenders need to invest in more security to raise the cost of breaking the system, or need to remove assets.
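
  The comparison on this slide as a back-of-the-envelope calculation; the dollar figures are invented for illustration.

      # Attack is economically rational when the expected value of the assets
      # exceeds the cost (time, expertise, resources) of breaking the system.
      asset_value   = 250_000   # e.g., resale value of a stolen customer database
      cost_to_break = 40_000    # exploit development, infrastructure, risk

      if asset_value > cost_to_break:
          print("Attack is worth it to the attacker: raise the cost or remove assets.")
      else:
          print("Attack is not currently worth the attacker's investment.")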

  29. Key Points • Problem of software security • Trinity of trouble: connectivity, complexity, extensibility • System security certifications • Orange book, Common criteria, PCI DSS • Security lifecycle • Abuse cases • Architectural risk analysis • Code reviews • Penetration and security testing • Operations • Economics of security • If value of assets > cost to break, then threats motivated to take assets.

  30. References • Aleph One, “Smashing the Stack for Fun and Profit,” Phrack 49, 1996. • Brian Chess and Jacob West, Secure Programming with Static Analysis, Addison-Wesley, 2007. • Goodrich and Tamassia, Introduction to Computer Security, Pearson, 2011. • Koziol et al., The Shellcoder’s Handbook: Discovering and Exploiting Security Holes, Wiley, 2004. • Robert C. Seacord, Secure Coding in C and C++, Addison-Wesley, 2006. • John Viega and Gary McGraw, Building Secure Software, Addison-Wesley, 2002. • David Wheeler, Secure Programming for UNIX and Linux HOWTO, http://www.dwheeler.com/secure-programs/Secure-Programs-HOWTO/index.html, 2003.
