
EE515/IS523 Think Like an Adversary Lecture 6 Access Control/Usability


Presentation Transcript


  1. EE515/IS523 Think Like an Adversary Lecture 6 Access Control/Usability Yongdae Kim

  2. Recap • http://security101.kr • E-mail policy • Include [ee515] or [is523] in the subject of your e-mail • Student Survey • http://bit.ly/SiK9M3 • Student Presentation • Send me email. • Preproposal meeting: Today after class

  3. Kerberos vs. PKI vs. IBE • Still debating  • Let’s go through them one by one!

  4. Kerberos (cont.) • Parties: A, B, and trusted server T • A → T: A, B, NA • T → A: EKBT(k, A, L), EKAT(k, NA, L, B) • A → B: EKBT(k, A, L), Ek(A, TA, Asubkey) • B → A: Ek(TA, Bsubkey) • EKBT(k, A, L): Token for B • EKAT(k, NA, L, B): Token for A • L: Life-time • NA? • Ek(A, TA, Asubkey): To prove to B that A knows k • TA: Time-stamp • Ek(TA, Bsubkey): To prove to A that B knows k
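
  A minimal sketch of this four-message exchange in Python, using Fernet from the third-party cryptography package as a stand-in for the symmetric encryption E. The key names and message layout follow the slide; the JSON serialization is an illustrative assumption, not the real Kerberos wire format, and subkeys are omitted.

      # Sketch of the basic Kerberos exchange from the slide (not the real wire format).
      # Requires: pip install cryptography
      import json, os, time
      from cryptography.fernet import Fernet

      k_AT, k_BT = Fernet.generate_key(), Fernet.generate_key()   # long-term keys shared with T

      # 1. A -> T: A, B, NA
      NA = os.urandom(8).hex()

      # 2. T -> A: E_kBT(k, A, L), E_kAT(k, NA, L, B)
      k = Fernet.generate_key().decode()                           # fresh session key k
      L = time.time() + 3600                                       # lifetime
      ticket_for_B = Fernet(k_BT).encrypt(json.dumps({"k": k, "A": "A", "L": L}).encode())
      token_for_A  = Fernet(k_AT).encrypt(json.dumps({"k": k, "NA": NA, "L": L, "B": "B"}).encode())

      # 3. A -> B: ticket_for_B, E_k(A, TA)   (A proves it knows k)
      TA = time.time()
      authenticator = Fernet(k.encode()).encrypt(json.dumps({"A": "A", "TA": TA}).encode())

      # 4. B -> A: E_k(TA)   (B decrypts the ticket, recovers k, proves it knows k)
      kB = json.loads(Fernet(k_BT).decrypt(ticket_for_B))["k"]
      reply = Fernet(kB.encode()).encrypt(json.dumps({"TA": TA}).encode())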

  5. Kerberos (Scalable) • Parties: A, T (AS), G (TGS), B • A → T: A, G, NA • T → A: EKGT(kAG, A, L), EKAT(kAG, NA, L, G) • A → G: EKGT(kAG, A, L), EkAG(A, TA), B, NA’ • G → A: EKAG(kAB, NA’, L, B), EKGB(kAB, A, L, NA’) • A → B: EKGB(kAB, A, L, NA’), EkAB(A, TA’, Asubkey) • B → A: EkAB(TA’, Bsubkey)

  6. Public Key Certificate • Public-key certificates are a vehicle by which public keys may be stored, distributed or forwarded over unsecured media • The objective: make one entity’s public key available to others such that its authenticity and validity are verifiable. • A public-key certificate is a data structure • data part: cleartext data including a public key and a string identifying the party (subject entity) to be associated therewith. • signature part: digital signature of a certification authority over the data part, binding the subject entity’s identity to the specified public key.
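
  A minimal sketch of the data-part/signature-part split, using Ed25519 from the third-party cryptography package; the field names and JSON encoding are illustrative assumptions, not X.509.

      # Toy certificate: cleartext data part + CA signature over it (not X.509).
      # Requires: pip install cryptography
      import json
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
      from cryptography.hazmat.primitives import serialization

      ca_key = Ed25519PrivateKey.generate()               # CA's signature key pair
      subject_key = Ed25519PrivateKey.generate()          # subject entity's key pair

      data_part = json.dumps({
          "subject": "CN=alice@example.com",              # distinguished name (example)
          "public_key": subject_key.public_key().public_bytes(
              serialization.Encoding.Raw, serialization.PublicFormat.Raw).hex(),
          "not_after": "2026-01-01",
      }).encode()

      signature_part = ca_key.sign(data_part)             # binds the identity to the public key
      cert = {"data": data_part, "sig": signature_part}

      # Any relying party holding the CA's authentic public key can verify the binding:
      ca_key.public_key().verify(cert["sig"], cert["data"])   # raises InvalidSignature on tampering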

  7. CA • A trusted third party whose signature on the certificate vouches for the authenticity of the public key bound to the subject entity • The significance of this binding must be provided by additional means, such as an attribute certificate or policy statement. • The subject entity must have a unique name within the system (distinguished name) • The CA requires its own signature key pair; its public key must be known authentically. • Can be off-line!

  8. ID-based Cryptography • No separate public key: the public key is the ID itself (email, name, etc.) • PKG • Private key generation center • SKID = PKGS(ID) • PKG’s public key is public. • Distributes the private key associated with each ID • Encryption: C = EID(M) • Decryption: DSKID(C) = M

  9. Discussion (PKI vs. Kerberos vs. IBE) • On-line vs. off-line TTP • Implication? • Non-repudiation? • Revocation? • Scalability? • Trust issue?

  10. OS Security • OS Security is essentially concerned with four problems: • User authentication links users to processes. • Access control is about deciding whether a process can access a resource. • Protection is the task of enforcing these decisions: ensuring a process does not access resources improperly. • Isolation is the separation of a process’s resources from those of other processes.

  11. Access Control • The OS mediates access requests between subjects and objects. • This mediation should (ideally) be impossible to avoid or circumvent. • [Diagram: Subject → Reference monitor → Object]

  12. Definitions • Subjects make access requests on objects. • Subjects are the ones doing things in the system, like users, processes, and programs. • Objects are system resources, like memory, data structures, instructions, code, programs, files, sockets, devices, etc… • The type of access determines what to do to the object, for example execute, read, write, allocate, insert, append, list, lock, administer, delete, or transfer

  13. Access Control • Discretionary Access Control: • Access to objects (files, directories, devices, etc.) is permitted based on user identity • Each object is owned by a user. • Owners can specify freely (at their discretion) how they want to share their objects with other users, • by specifying which other users can have which form of access to their objects. • Discretionary access control is implemented on any multi-user OS (Unix, Windows NT, etc.). • Mandatory Access Control: • Access to objects is controlled by a system-wide policy • for example to prevent certain flows of information. • In some forms, the system maintains security labels for both objects and subjects • based on which access is granted or denied. • Labels can change as the result of an access • Security policies are enforced without the cooperation of users or application programs. • Mandatory access control for Linux: http://www.nsa.gov/research/selinux/

  14. Access Control Matrix

  15. Representations • An access control matrix can be represented internally in different ways: • Access Control Lists (ACLs) store the columns with the objects • Capability lists store the rows with the subjects • Role-based systems group rights according to the “role” of a subject.
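
  A minimal sketch (hypothetical subject, object, and right names) of the same access control matrix stored column-wise as ACLs and row-wise as capability lists:

      # One access control matrix, two internal representations.
      matrix = {("alice", "grades.txt"): {"read", "write"},
                ("bob",   "grades.txt"): {"read"},
                ("bob",   "notes.txt"):  {"read", "write"}}

      # Column-wise: ACLs, stored with each object.
      acl = {}
      for (subj, obj), rights in matrix.items():
          acl.setdefault(obj, {})[subj] = rights
      # acl["grades.txt"] == {"alice": {"read", "write"}, "bob": {"read"}}

      # Row-wise: capability lists, stored with each subject.
      caps = {}
      for (subj, obj), rights in matrix.items():
          caps.setdefault(subj, {})[obj] = rights
      # caps["bob"] == {"grades.txt": {"read"}, "notes.txt": {"read", "write"}}

      def allowed(subj, obj, right):
          # Checking a request looks in the object's ACL (equivalently, the subject's capabilities).
          return right in acl.get(obj, {}).get(subj, set())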

  16. Access Control Lists • The ACL for an object lists the access rights of each subject (usually users). • To check a request, look in the object’s ACL. • ACLs are used by most OSes and network file systems, e.g. NT, Unix, and AFS.

  17. ACL Problems • To be secure, the OS must authenticate that the user is who (s)he claims to be. • To revoke a user’s access, we must check every object in the system. • There is often no good way to restrict a process to a subset of the user’s rights.

  18. Capabilities • Capabilities store the list of allowed object accesses with each subject. • When the subject requests access to object O, it must provide a “ticket” granting access to O. • These tickets are stored in an OS-protected table associated with each process. • No widely-used OS uses pure capabilities. • Some systems have “capability-like” features: e.g. Kerberos, NT, OLPC, Android

  19. ACL vs. Capabilities • Capabilities do not require authentication: the OS just checks each ticket on access requests. • Capabilities can be passed, or delegated, from one process to another. • We can limit the privileges of a process by removing unnecessary tickets from the table.
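
  A small sketch of the last two points, delegation and privilege reduction, using a hypothetical per-process ticket table:

      # Each process holds a table of tickets: object -> set of rights.
      parent = {"grades.txt": {"read", "write"}, "notes.txt": {"read"}}

      # Delegation: pass a ticket to another process (no re-authentication needed).
      child = {"grades.txt": set(parent["grades.txt"])}

      # Attenuation: drop unnecessary rights before delegating.
      child["grades.txt"].discard("write")

      def access(process, obj, right):
          # The OS only checks the ticket presented with the request.
          return right in process.get(obj, set())

      assert access(parent, "grades.txt", "write")
      assert not access(child, "grades.txt", "write")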

  20. Roles • [Diagram: rather than giving subjects S1 … Sm rights over objects O1 … On directly, subjects are assigned to roles R1, R2, and the roles hold the rights over the objects.]

  21. Unix/POSIX Access Control
  kyd@dio (~) % id
  uid=3259(kyd) gid=717(faculty) groups=717(faculty),1686(mess),1847(S07C8271),1910(F07C5471),2038(S08C8271)
  kyd@dio (~) % ls -l News_and_Recent_Events.zip
  -rw-rw-rw- 1 kyd faculty 714904 Feb 22 10:00 News_and_Recent_Events.zip
  kyd@dio (/web/classes02/Spring-2011/csci5471) % ls -al
  drwxrwsr-x 4 kyd S11C5471 512 Jan 19 10:23 ./
  drwxr-xr-x 46 root daemon 1024 Feb 17 23:04 ../
  drwxrwsr-x 3 kyd S11C5471 512 Feb 16 00:36 Assignment/
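
  The same mode bits can be read programmatically; a small sketch using Python's standard os and stat modules (the file name is just the example from above):

      # Decode Unix permission bits for a file.
      import os, stat

      st = os.stat("News_and_Recent_Events.zip")
      print(stat.filemode(st.st_mode))        # e.g. "-rw-rw-rw-", as printed by ls -l
      print(oct(stat.S_IMODE(st.st_mode)))    # e.g. "0o666"
      print(st.st_uid, st.st_gid)             # numeric owner uid and group gid
      print(bool(st.st_mode & stat.S_ISGID))  # set-group-ID bit (the "s" in drwxrwsr-x)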

  22. Mandatory Access Control policies • Restrictions to allowed information flows are not decided at the user’s discretion (as with Unix chmod), but instead enforced by system policies. • Mandatory access control mechanisms are aimed in particular at preventing policy violations by untrusted application software, which typically has at least the same access privileges as the invoking user.

  23. Data Pump/Data Diode • Like “air gap” security, but with a one-way communication link that allows users to transfer data from the low-confidentiality to the high-confidentiality environment, but not vice versa. • Examples: • Workstations with highly confidential material are configured to have read-only access to low-confidentiality file servers.

  24. The covert channel problem • Reference monitors see only intentional communications channels, such as files, sockets, and memory. • However, there are many more “covert channels”, which were neither designed nor intended to transfer information at all. • A malicious high-level program can use these to transmit high-level data to a low-level receiving process, which can then leak it to the outside world. • Examples of covert channels: • Resource conflicts – If a high-level process has already created a file F, a low-level process will fail when trying to create a file of the same name → 1 bit of information (see the sketch below). • Timing channels – Processes can use the system clock to monitor their own progress and infer the current load, into which other processes can modulate information. • Resource state – High-level processes can leave shared resources (disk head position, cache memory content, etc.) in states that influence the service response times for the next process. • Hidden information in downgraded documents – Steganographic embedding techniques can be used to get confidential information past a human downgrader (least-significant bits in digital photos, variations of punctuation/spelling/whitespace in plaintext, etc.).
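
  A minimal sketch of the first example, the resource-conflict channel: the high-level sender encodes one bit per agreed-upon name, and the low-level receiver reads it back by observing whether exclusive creation fails. Directory and file names are illustrative only.

      # 1-bit-per-name covert channel via file-creation conflicts (illustrative only).
      import os, tempfile

      SHARED = tempfile.gettempdir()          # a directory both processes can see

      def send_bit(name, bit):
          # High-level sender: create the agreed-upon file to signal a 1.
          if bit:
              open(os.path.join(SHARED, name), "w").close()

      def recv_bit(name):
          # Low-level receiver: a name collision (file already exists) means 1.
          try:
              fd = os.open(os.path.join(SHARED, name), os.O_CREAT | os.O_EXCL)
              os.close(fd)
              return 0
          except FileExistsError:
              return 1

      send_bit("covert_0001", 1)
      print(recv_bit("covert_0001"))          # -> 1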

  25. User Interface Failures

  26. Humans “Humans are incapable of securely storing high-quality cryptographic keys, and they have unacceptable speed and accuracy when performing cryptographic operations. (They are also large, expensive to maintain, difficult to manage, and they pollute the environment. It is astonishing that these devices continue to be manufactured and deployed. But they are sufficiently pervasive that we must design our protocols around their limitations.)” −− C. Kaufman, R. Perlman, and M. Speciner. Network Security: PRIVATE Communication in a PUBLIC World. 2nd edition. Prentice Hall, page 237, 2002.

  27. Humans are weakest link • Most security breaches attributed to “human error” • Social engineering attacks proliferate • Frequent security policy compliance failures • Automated systems are generally more predictable and accurate than humans

  28. Why are humans in the loop at all? • Don’t know how or too expensive to automate • Human judgments or policy decisions needed • Need to authenticate humans

  29. The human threat • Malicious humans who will attack system • Humans who are unmotivated to perform security-critical tasks properly or comply with policies • Humans who don’t know when or how to perform security-critical tasks • Humans who are incapable of performing security-critical tasks

  30. Need to better understand humans in the loop • Do they know they are supposed to be doing something? • Do they understand what they are supposed to do? • Do they know how to do it? • Are they motivated to do it? • Are they capable of doing it? • Will they actually do it?

  31. SSL Warnings

  32. False Alarm Effect • “Detection system” ≈ “System” • If the risk is not immediate, warning the user decreases her trust in the system

  33. Patco Construction vs. Ocean Bank • Hackers stole ~$600K from Patco through the Zeus trojan • The transfers triggered the bank’s alarms, but they were ignored • The bank “substantially increase[d] the risk of fraud by asking for security answers for every $1 transaction” • It “neither monitored that transaction nor provided notice before [it was] completed” • The court found the bank’s security “commercially unreasonable” • Out-of-Band Authentication • User-Selected Picture • Tokens • Monitoring of Risk-Scoring Reports

  34. Password Authentication

  35. Definitions • Identification - a claim about identity • Who or what I am (global or local) • Authentication - confirming that claims are true • I am who I say I am • I have a valid credential • Authorization - granting permission based on a valid claim • Now that I have been validated, I am allowed to access certain resources or take certain actions • Access control system - a system that authenticates users and gives them access to resources based on their authorizations • Includes or relies upon an authentication mechanism • May include the ability to grant coarse or fine-grained authorizations, revoke or delegate authorizations • Also includes an interface for policy configuration and management

  36. Building blocks of authentication • Factors • Something you know (or recognize) • Something you have • Something you are • Two factors are better than one • Especially two factors from different categories • What are some examples of each of these factors? • What are some examples of two-factor authentication?
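
  As one concrete example of a “something you have” factor, a minimal RFC 6238 TOTP sketch using only the Python standard library (the base32 secret below is a made-up example):

      # Time-based one-time password (RFC 6238), the usual "authenticator app" second factor.
      import base64, hashlib, hmac, struct, time

      def totp(secret_b32, period=30, digits=6):
          key = base64.b32decode(secret_b32, casefold=True)
          counter = struct.pack(">Q", int(time.time()) // period)   # 30-second time step
          mac = hmac.new(key, counter, hashlib.sha1).digest()
          offset = mac[-1] & 0x0F                                    # dynamic truncation
          code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
          return str(code % 10 ** digits).zfill(digits)

      print(totp("JBSWY3DPEHPK3PXP"))   # prover and verifier compute the same 6 digits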

  37. Authentication mechanisms • Text-based passwords • Graphical passwords • Hardware tokens • Public key crypto protocols • Biometrics

  38. Evaluation • Accessibility • Memorability • Security • Cost • Environmental considerations

  39. Typical password advice

  40. Typical password advice • Pick a hard to guess password • Don’t use it anywhere else • Change it often • Don’t write it down So what do you do when every web site you visit asks for a password?

  41. Bank = b3aYZ Amazon = aa66x! Phonebill = p$2$ta1

  42. Problems with Passwords • Selection • Difficult to think of a good password • Passwords people think of first are easy to guess • Memorability • Easy to forget passwords that aren’t frequently used • Difficult to remember “secure” passwords with a mix of upper & lower case letters, numbers, and special characters • Reuse • Too many passwords to remember • A previously used password is memorable • Sharing • Often unintentional through reuse • Systems aren’t designed to support the way people work together and share information

  43. Mnemonic Passwords • Start with a phrase: “Four score and seven years ago, our Fathers …” • First letter of each word (with punctuation): fsasya,oF • Substitute numbers for words or similar-looking letters: 4sa7ya,oF (or just 4sasya,oF) • Substitute symbols for words or similar-looking letters: 4s&7ya,oF • Source: Cynthia Kuo, SOUPS 2006
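
  A small sketch of the construction above; the phrase and the substitution table are just the slide’s example, not a recommendation to build passwords from famous quotes:

      # Build a mnemonic password: first character of each word, then substitutions.
      SUBS = {"four": "4", "seven": "7", "and": "&"}    # word-level substitutions from the slide

      def mnemonic(phrase):
          out = []
          for word in phrase.split():
              core = word.rstrip(",.")
              out.append(SUBS.get(core.lower(), core[0]))   # substituted digit/symbol or first letter
              if word.endswith(","):                        # keep punctuation, as on the slide
                  out.append(",")
          return "".join(out)

      print(mnemonic("Four score and seven years ago, our Fathers"))   # -> 4s&7ya,oF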

  44. The Promise? • Phrases help users incorporate different character classes in passwords • Easier to think of character-for-word substitutions • Virtually infinite number of phrases • Dictionaries do not contain mnemonics Source: Cynthia Kuo, SOUPS 2006

  45. Mnemonic password evaluation • Mnemonic passwords are not a panacea for password creation • No comprehensive dictionary today • May become more vulnerable in the future • As many people start to use them, attackers are incentivized to build dictionaries • Publicly available phrases should be avoided! Source: Cynthia Kuo, SOUPS 2006

  46. Password keeper software • Run on PC or handheld • Only remember one password
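
  A minimal sketch of the idea — one master password protecting a small encrypted store — using PBKDF2 and Fernet from the third-party cryptography package. The entries reuse the example passwords from the earlier slide; the parameters are illustrative, and a real password manager needs far more care.

      # Toy password keeper: entries encrypted under a key derived from one master password.
      # Requires: pip install cryptography
      import base64, json, os
      from cryptography.fernet import Fernet
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

      def keeper_key(master_password, salt):
          kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
          return base64.urlsafe_b64encode(kdf.derive(master_password.encode()))

      salt = os.urandom(16)
      f = Fernet(keeper_key("correct horse battery staple", salt))    # example master password

      vault = f.encrypt(json.dumps({"Bank": "b3aYZ", "Amazon": "aa66x!"}).encode())
      # Later: re-derive the key from the stored salt and the master password to unlock everything.
      print(json.loads(f.decrypt(vault)))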

  47. Single sign-on • Login once to get access to all your passwords

  48. Biometrics
