Software Security and Security Engineering (Part 2) Software Engineering Sources: Ian Sommerville, Software Engineering, Chapter 14; Matt Bishop, Computer Security: Art and Science, Addison-Wesley, 2003, Chapter 28
Security Principles • 1975: Saltzer and Schroeder's fundamental principles of secure software design • Main goal: restriction with simplicity
Least Privilege • A subject should be given only those privileges necessary to complete its task • The subject's function, not its identity, should control the assignment of rights • Minimal protection domain • Rights added as needed, discarded after use
Fail-Safe Defaults • Default action is to deny access • Failing closed (denying access) is safer than failing open (permitting access) • If unable to complete a task, undo any changes made
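A minimal sketch of this principle in C (the function and the specific grant rules are hypothetical): the access check initializes its result to "deny" and changes it only on an explicit grant, so anything unanticipated stays denied.

```c
#include <stdbool.h>
#include <string.h>

/* Hypothetical access check illustrating fail-safe defaults:
 * the result starts as "deny" and is changed only by an explicit
 * grant, so unrecognized users or resources fall through safely. */
bool is_access_allowed(const char *user, const char *resource)
{
    bool allowed = false;                       /* fail-safe default: deny */

    if (strcmp(user, "admin") == 0)
        allowed = true;                         /* explicit grant */
    else if (strcmp(resource, "public/readme.txt") == 0)
        allowed = true;                         /* explicit grant */

    return allowed;                             /* everything else remains denied */
}
```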
Economy of Mechanism • Keep it as simple as possible • Simpler means less can go wrong
Complete Mediation • Check every access • Every time • No bypass
Open Design • Strength of security should not depend on secrecy of design or implementation • Does not apply to information such as passwords or cryptographic keys
Separation of Privilege • Require multiple conditions to grant privilege/access • Separation of duty
Least Common Mechanism • Mechanisms/Resources should not be shared • Information can flow along shared channels
Psychological Acceptability • Security mechanisms should not add to difficulty of accessing resource • Hide complexity introduced by security mechanisms • Ease of installation, configuration, use
Mapping of Design Guidelines to Saltzer and Schroeder's Principles • Economy of Mechanism • Open Design • Complete Mediation • Fail-Safe Defaults • Separation of Privilege • Least Privilege • Least Common Mechanism • Psychological Acceptability
Design guidelines 1-3 • Base decisions on an explicit security policy • Define a security policy for the organization that sets out the fundamental security requirements that should apply to all organizational systems. • Avoid a single point of failure • Ensure that a security failure can only result when there is more than one failure in security procedures. For example, have password and question-based authentication. • Fail securely • When systems fail, for whatever reason, ensure that sensitive information cannot be accessed by unauthorized users even though normal security procedures are unavailable.
Design guidelines 4-6 • Balance security and usability • Try to avoid security procedures that make the system difficult to use. Sometimes you have to accept weaker security to make the system more usable. • Log user actions • Maintain a log of user actions that can be analyzed to discover who did what. If users know about such a log, they are less likely to behave in an irresponsible way. • Use redundancy and diversity to reduce risk • Keep multiple copies of data and use diverse infrastructure so that an infrastructure vulnerability cannot be the single point of failure.
Design guidelines 7-10 • Validate all inputs • Check all inputs so that unexpected inputs cannot cause problems. • Compartmentalize your assets • Organize the system so that assets are in separate areas and users only have access to the information that they need rather than all system information. • Design for deployment • Design the system to avoid deployment problems. • Design for recoverability • Design the system to simplify recoverability after a successful attack.
Common security-related programming problems • Improper choice of initial protection domain • Improper isolation of implementation detail • Improper change in file contents • Improper naming • Improper deallocation, deletion • Improper validation • Improper indivisibility • Improper sequencing • Improper choice of operand or operation
Improper Choice of Initial Protection Domain • Arise from incorrect setting of permissions or privileges • Process privileges • Assumptions • Memory protection
Process Privileges • Least privilege principle • Implementation Rule: • Structure the process so that all sections requiring extra privileges are modules. • The modules should be as small as possible and should perform only those tasks that require those privileges. • Example: • When implementing authentication module • privileges acquired only when needed, and relinquished once immediate task is complete
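A minimal sketch of this rule on a POSIX system, assuming a setuid program in which only one small module needs elevated privileges; the function names are illustrative, not from the slides.

```c
#include <stdio.h>
#include <unistd.h>

/* Privileged work is confined to one small module: privileges are
 * acquired only when needed and relinquished as soon as the task
 * completes (least privilege). */
static uid_t real_uid;   /* unprivileged invoking user */
static uid_t priv_uid;   /* privileged effective uid (e.g., root) */

void privileges_init(void)
{
    real_uid = getuid();
    priv_uid = geteuid();
    seteuid(real_uid);                /* start out unprivileged */
}

FILE *open_protected_file(const char *path)
{
    FILE *fp;

    if (seteuid(priv_uid) != 0)       /* acquire privilege only here */
        return NULL;
    fp = fopen(path, "r");
    if (seteuid(real_uid) != 0) {     /* relinquish immediately; failing to drop is fatal */
        if (fp)
            fclose(fp);
        return NULL;
    }
    return fp;
}
```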
Assumptions • Incorrect or flawed assumption can open opportunity for exploitation • Implementation Rule: • Ensure that any assumptions in the program are validated. • If this is not possible, document them for the installers and maintainers, so they know the assumptions that attackers will try to invalidate. • Example: • Document importance of integrity control for the password file that is used by the authentication module
Memory Protection • Shared memory • If two processes have access, one can change data other relies upon, or read data other considers secret • Least common mechanism principle • Implementation Rule • Ensure that the program does not share objects in memory with any other program, and that other programs cannot access the memory of a privileged process. • Example • Declare variables that should not be changed in program as const
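A short sketch of the const example above (the names are illustrative): data that must not change is declared const so accidental writes are rejected at compile time, and read-only access is expressed in function parameters.

```c
#include <stddef.h>

/* Data that should never change is declared const; the compiler
 * rejects any attempt to modify it. */
static const char auth_failure_msg[] = "Authentication failed.";

/* A function that only needs to read a buffer takes a const pointer,
 * so it cannot modify the caller's data without an explicit cast. */
size_t count_records(const char *buf, size_t len)
{
    size_t count = 0;
    for (size_t i = 0; i < len; i++)
        if (buf[i] == '\n')
            count++;                  /* reads only; a write here would not compile */
    return count;
}
```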
Memory Protection (cont.) • Implementation Rule: • If a process interacts with other processes, the interactions should be synchronized. In particular, all possible sequences of interactions must be known and, for all such interactions, the process must enforce the required security policy.
Memory Protection (cont.) • Executable memory • With a buffer overflow attack (active learning exercise), one can overflow a buffer to overwrite memory with malicious code, which when executed could cause harm. • Least privilege principle • Implementation Rule • Whenever possible, data that the process trusts and data that it receives from untrusted sources (such as input) should be kept in separate areas of memory. If data from a trusted source is overwritten with data from an untrusted source, a memory error should occur • Check size of user input • Treat user input as untrusted data (not executable code) • Check that values are valid • Do not reuse variables used for data input • Any out-of-bounds reference should invoke an exception handler
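A minimal sketch of bounded, validated input handling in C (the function name and character policy are illustrative): the buffer size is passed to fgets so input cannot overflow the buffer, and the value is checked before use, so input is treated strictly as data.

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Reads a user name with an explicit bound and validates it before
 * use: input can never overflow the buffer, and only expected
 * characters are accepted (input is treated as data, not code). */
int read_username(char *name, size_t size)
{
    if (fgets(name, (int)size, stdin) == NULL)
        return -1;                           /* read failure: fail safely */

    name[strcspn(name, "\n")] = '\0';        /* strip the trailing newline */

    if (name[0] == '\0')
        return -1;                           /* reject empty input */

    for (const char *p = name; *p != '\0'; p++)
        if (!isalnum((unsigned char)*p))
            return -1;                       /* reject unexpected characters */

    return 0;
}
```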
Improper Isolation of Implementation Detail • Look for errors, failures of mapping from abstraction to implementation • Usually error messages should capture these • Fail-safe default principle • Implementation Rule: • The error status of every function must be checked. • Do not try to recover unless the cause of the error, and its effects, do not affect any security considerations. The program should restore the state of the system to the state before the process began, and then terminate. • Example: • Carefully thought-out error messages • When authentication fails for some reason, do not reveal exactly why, as this aids attackers • If a DB query fails, do not give out error messages with information on the DB schema • Do not fall back to default behavior in an unexpected situation; fail securely instead
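A sketch of the rule, assuming a hypothetical back-end helper db_check_credentials: every error status is checked, the detail is logged internally, and the caller sees only a generic failure message.

```c
#include <stdio.h>

/* Hypothetical back-end check; only its status code matters here. */
extern int db_check_credentials(const char *user, const char *password);

int authenticate(const char *user, const char *password)
{
    int rc = db_check_credentials(user, password);

    if (rc != 0) {
        /* Detail goes to an internal log for administrators... */
        fprintf(stderr, "auth: credential check failed (code %d)\n", rc);
        /* ...while the user learns only that the login failed,
         * with no hint about the cause or the database schema. */
        printf("Login failed.\n");
        return -1;
    }
    return 0;
}
```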
Improper Change in File Contents • May happen when multiple processes have access to same file or when dynamic libraries are used • Implementation Rule: • Do not use components that may change between the time the program is created and the time it is run. • Example • Dynamic libraries
Improper Naming • Ambiguity in identifying object name • Names are interpreted in context • Unique objects cannot share names within same context • Context includes: • Character set composing name • Process, file hierarchies • Network domains • Customizations such as search path • Implementation Rule: • The process must ensure that the context in which an object is named identifies the correct object. • Example: • Use of absolute path names like c:/secret/names.txt instead of names.txt
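A small sketch of the example above (the path is the one from the slide): the object is named by a fixed absolute path, so the file that is opened does not depend on the current directory or on a search path an attacker might control.

```c
#include <stdio.h>

/* Absolute path fixed at build time: the name identifies exactly one
 * object regardless of the working directory or search path. */
#define NAMES_FILE "c:/secret/names.txt"

FILE *open_names_file(void)
{
    return fopen(NAMES_FILE, "r");   /* not fopen("names.txt", "r") */
}
```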
Improper Deallocation, Deletion • Sensitive information can be exposed if object containing it is reallocated • Erase data, then deallocate • Implementation Rule: • When the process finishes using a sensitive object (one that contains confidential information or one that should not be altered), the object should be erased, then deallocated or deleted. Any resources not needed should also be released.
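A minimal sketch, assuming a heap-allocated secret: the sensitive buffer is erased before it is freed. Note that a plain memset can be optimized away by the compiler; where available, memset_s or explicit_bzero are more robust choices.

```c
#include <stdlib.h>
#include <string.h>

/* Erase, then deallocate: a later allocation that reuses this memory
 * cannot expose the old secret. */
void destroy_secret(char *secret, size_t len)
{
    if (secret == NULL)
        return;
    memset(secret, 0, len);   /* erase the sensitive contents first */
    free(secret);             /* then release the memory */
}
```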
Improper Validation • Something not checked for consistency or correctness • Bounds checking • Type checking
Bounds Checking • Improper bounds checking results in overwriting • Implementation Rule: • Ensure that all array references access existing elements of the array. If a function that manipulates arrays cannot ensure that only valid elements are referenced, do not use that function. Find one that does, or write a new version. • Example: • strcpy never checks bounds; too dangerous • Use strncpy, where you can specify the bound
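A short sketch of the strcpy/strncpy point (the buffer size is illustrative): the destination size is passed to strncpy so the copy cannot run past the end of the buffer; because strncpy does not guarantee a terminator when the source is too long, one is added explicitly.

```c
#include <string.h>

#define BUF_LEN 32

/* buf is assumed to hold at least BUF_LEN bytes. */
void copy_name(char buf[BUF_LEN], const char *src)
{
    strncpy(buf, src, BUF_LEN - 1);   /* bounded copy, unlike strcpy */
    buf[BUF_LEN - 1] = '\0';          /* ensure termination */
}
```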
Type Checking • Ensure arguments, inputs, and such are of the right type • Interpreting a floating-point value as an integer, or a short as a long, will generate erroneous results • Implementation Rule: • Check the types of functions and parameters.
Designing for Validation • Some validations are impossible due to structure of language or other factors • Example: in C, one can test for NULL pointer, but not for a “valid” pointer • So avoid situations where pointers are passed and must be validated • Implementation Rule: • Create data structures and functions in such a way that they can be validated.
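One way to apply this rule, sketched below with illustrative names: instead of passing a bare pointer that cannot be checked, the data structure carries the information needed to validate it.

```c
#include <stdbool.h>
#include <stddef.h>

/* The structure carries enough state to be validated before use,
 * rather than relying on a raw pointer that cannot be checked. */
struct buffer {
    char  *data;
    size_t len;
    bool   in_use;
};

bool buffer_is_valid(const struct buffer *b)
{
    return b != NULL && b->in_use && b->data != NULL && b->len > 0;
}
```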
Improper Indivisibility • Problems arise when operations that should be indivisible are divisible • Separation of privilege principle • Implementation Rule: • If two operations must be performed sequentially without an intervening operation, use a mechanism to ensure that the two cannot be divided.
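A classic illustration on POSIX systems (a sketch, not from the slides): checking whether a file exists and then creating it as two separate steps can be interleaved by an attacker (a time-of-check-to-time-of-use race); opening with O_CREAT | O_EXCL makes the check and the creation a single indivisible operation.

```c
#include <fcntl.h>

/* The existence check and the creation happen in one atomic open();
 * the call fails if the file already exists, so the two steps cannot
 * be divided by an intervening operation. */
int create_private_file(const char *path)
{
    return open(path, O_CREAT | O_EXCL | O_WRONLY, 0600);
}
```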
Improper Sequencing • Operations performed in incorrect order can produce undesired results (active learning exercise) • Implementation Rule: • Describe the legal sequences of operations on a resource or object. Check that all sequences of operations actually performed by the program(s) are legal.
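A minimal sketch of this rule with illustrative names: the legal sequence of operations on a resource is recorded as a small state machine, and each operation checks that it is being invoked in a legal state (for example, read only after open).

```c
#include <stdbool.h>

enum file_state { CLOSED, OPENED };

struct tracked_file {
    enum file_state state;   /* current position in the legal sequence */
};

bool tf_open(struct tracked_file *f)
{
    if (f->state != CLOSED)
        return false;        /* open is only legal from the CLOSED state */
    f->state = OPENED;
    return true;
}

bool tf_read(struct tracked_file *f)
{
    if (f->state != OPENED)
        return false;        /* reading before opening is rejected */
    /* ... perform the read here ... */
    return true;
}
```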
Improper Choice of Operand or Operation • Erroneous selection of operation or operand (active learning exercise) • Implementation Rule: • Ensure that the operands and operations chosen yield the behavior required by the security policy.
Questions? The End Devon M. Simmonds, Computer Science Department, University of North Carolina Wilmington