
Psychology and Security


Presentation Transcript


  1. Psychology and Security

  2. Agenda • Tuesday, June 28th • Psychology and Security • Thursday, June 30th • Usable Security

  3. References • Ross Anderson, Security Engineering, Chapter 2, “Usability and Psychology” • Ryan West, “The Psychology of Security”, Communications of the ACM, April 2008, pp. 34-40.

  4. People • “Only amateurs attack machines; professionals target people.” (Bruce Schneier) • Many real attacks exploit psychology at least as much as technology. • Kevin Mitnick, The Art of Deception

  5. Phishing • It is much easier for crooks to build a bogus bank website that passes casual inspection than it is for them to create a bogus bank in a shopping mall.

  6. Phishing Examples • US Bank • Amazon • Twitter

  7. Pretexting & Social Engineering • The most common way for private investigators to steal personal information is pretexting — phoning someone who has the information under a false pretext, usually by pretending to be someone authorized to be told it. Such attacks are sometimes known collectively as social engineering.

  8. Trusting people • Many frauds work by appealing to our atavistic instincts to trust people more in certain situations.

  9. Psychological manipulation • As designers learn how to forestall the easier techie attacks, psychological manipulation of system users or operators becomes ever more attractive. • The security engineer simply must understand basic psychology and ‘security usability’.

  10. IRS Social Engineering • Fixing the problem is hard. Despite continuing publicity about pretexting, the Treasury Inspector General for Tax Administration audited the IRS in 2007: its staff called 102 IRS employees at all levels, asked for their user IDs, and told them to change their passwords to a known value. 62 did so.

  11. Policies & Training • It’s not enough for rules to exist; you have to train all the staff who have access to the confidential material, and explain to them the reasons behind the rules.

  12. Research Areas • Information security and psychology • Human-computer interaction (HCI) • Poorly understood by systems developers • Information security and economics

  13. Perception of Risk • Terrorism is largely about manipulating perceptions of risk. • Many protection mechanisms are sold using scaremongering.

  14. Cognitive psychology • How we think, remember, and make decisions. • What makes security harder than safety is that we have a sentient attacker who will try to provoke exploitable errors.

  15. Practiced actions • People are trained to click ‘OK’ to pop-up boxes as that’s often the only way to get the work done.

  16. Risk Evaluation • Risk and uncertainty are extremely difficult concepts for people to evaluate. • For designers of security systems, it is important to understand how users evaluate and make decisions regarding security. • The most elegant and intuitively designed interface does not improve security if users ignore warnings, choose poor settings, or unintentionally subvert corporate policies.

  17. Risk Evaluation • The user problem in security systems is not just about user interfaces or system interaction. Fundamentally, it is how people think about risk that guides their behavior.

  18. Following rules • Phishers craft URLs that start with the impersonated bank’s name, as in www.citibank.secureauthentication.com. For many people, looking for the bank’s name anywhere in the URL is a stronger rule than parsing where in the hostname it appears.
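
The bogus hostname above works because the bank’s name appears at the front, while the registrable domain, the part that actually determines ownership, sits at the end. A minimal Python sketch of the distinction follows; it naively treats the last two host labels as the registrable domain, whereas robust code would consult the Public Suffix List.

```python
# Why the *position* of the bank's name in a hostname matters.
# Naive assumption for illustration: the registrable domain is the
# last two labels. Real code should use the Public Suffix List.
from urllib.parse import urlparse

def registrable_domain(url: str) -> str:
    """Return the last two labels of the URL's hostname."""
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

for url in ("https://www.citibank.com/login",
            "https://www.citibank.secureauthentication.com/login"):
    print(url, "->", registrable_domain(url))

# citibank.com owns the first URL; secureauthentication.com owns the
# second, even though "citibank" appears in it. Scanning for the name
# instead of its position misses the difference.
```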

  19. Mental Model • Attackers exploit dissonances between users’ mental models of a system and its actual logic. • A cognitive walkthrough can be aimed at identifying attack points, just as a code walkthrough can be used to search for software vulnerabilities.

  20. Behavioral economics • People’s decision processes depart from rational behavior. • The heuristics we use in everyday judgment and decision making lie somewhere between rational thought and the unmediated input from the senses.

  21. Calculating Probabilities • We’re also bad at calculating probabilities, and use all sorts of heuristics to help us make decisions: • We also worry too much about unlikely events. • Many people perceive terrorism to be a much worse threat than food poisoning or road traffic accidents.

  22. Problem 1 • Read “Users do not think they are at risk” on page 36 of Ryan West, “The Psychology of Security”. • Complete Problem 1

  23. Users aren’t stupid, they’re unmotivated • To conserve mental resources, we generally tend to favor quick decisions based on learned rules and heuristics. • This is efficient in the sense that it is quick, it minimizes effort, and the outcome is good enough most of the time (the “cognitive miser”). • This partially accounts for why users do not reliably read all the relevant text in a display or consider all the consequences of their actions.

  24. Problem 2 • Safety is an abstract concept. • Choose a partner. • Complete Problem #2

  25. Evaluating the security/cost trade-off • While the gains of security are generally abstract, the cost is real and immediate. • Security usually comes with a price paid in time, effort, and convenience. • Users weigh the cost of the effort against the perceived value of the gain (safety/security) and the perceived chance that nothing bad would happen either way.
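
One way to make this weighing concrete is a toy expected-cost comparison. Everything here, the function name, the numbers, and the idea of expressing effort and loss in the same arbitrary utility units, is an illustrative assumption, not a model from the slides’ sources.

```python
# Toy model: a user complies only when the certain, immediate cost of a
# security step is smaller than the expected loss it is believed to avoid.
# Effort and loss are expressed in the same arbitrary utility units.
def user_complies(effort_cost: float,
                  perceived_incident_probability: float,
                  perceived_loss: float) -> bool:
    expected_loss_avoided = perceived_incident_probability * perceived_loss
    return effort_cost < expected_loss_avoided

# A noticeable effort against a loss the user believes is very unlikely:
print(user_complies(effort_cost=30,
                    perceived_incident_probability=0.0001,
                    perceived_loss=10_000))   # False -> the user skips the step
```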

  26. Risk aversion • People dislike losing $100 they already have more than they value winning $100. • Marketers talk in terms of ‘discount’ and ‘saving’: framing an action as a gain rather than as a loss makes people more likely to take it.

  27. Problem 3 • Security as a secondary task. • Losses are perceived disproportionately to gains. • With your partner, complete Problem #3.

  28. Principle of Psychological Acceptability • Security mechanisms should not make the resource more difficult to access than if the security mechanisms were not present. • Saltzer & Schroeder, 1975

  29. Principle of Psychological Acceptability • The security mechanism may add some extra burden, but that burden must be both minimal and reasonable. • Should every file access require the user to enter a password?
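
A hypothetical way to keep such a burden minimal is to authenticate once and trust the result for a short grace period, much as sudo does, rather than prompting on every access. The names and the five-minute window below are assumptions for illustration.

```python
# Sketch: prompt for a password only when the grace period has expired,
# instead of on every file access. GRACE_SECONDS is an assumed value.
import time
from typing import Callable, Optional

GRACE_SECONDS = 300              # trust a successful login for 5 minutes
_last_auth: Optional[float] = None

def ensure_authenticated(prompt: Callable[[], None]) -> None:
    """Run the password prompt only if the grace period has lapsed."""
    global _last_auth
    now = time.monotonic()
    if _last_auth is None or now - _last_auth > GRACE_SECONDS:
        prompt()                 # ask for and verify the password
        _last_auth = now         # start a new grace period

def open_protected(path: str, prompt: Callable[[], None]):
    ensure_authenticated(prompt)
    return open(path)            # subsequent accesses skip the prompt
```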

  30. Password Policies • Many users want a simple, easy-to-remember password. They do not want to change their password. They write down their password. They want to use the same password for all their accounts. • It is a challenge to write a password policy that is psychologically acceptable and still provides security.
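
A sketch of one psychologically acceptable compromise follows: favor length, which suits memorable passphrases, over arbitrary composition rules, and reject only known-bad choices. The threshold and the tiny blocklist are illustrative assumptions.

```python
# A minimal password policy that trades composition rules for length.
# The 12-character minimum and the blocklist are assumed values.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def check_password(candidate: str) -> list:
    """Return human-readable problems; an empty list means acceptable."""
    problems = []
    if len(candidate) < 12:
        problems.append("Use at least 12 characters; a phrase works well.")
    if candidate.lower() in COMMON_PASSWORDS:
        problems.append("That password is far too common.")
    return problems

print(check_password("correct horse battery staple"))  # [] -> accepted
print(check_password("qwerty"))                        # two problems
```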

  31. Airport Security • Is it psychologically acceptable? • How about full body scans and pat downs?

  32. IMPROVING SECURITY COMPLIANCE AND DECISION MAKING • Reward pro-security behavior. • Users must be motivated to take pro-security actions. • There must be a tangible reward for making good security decisions. • One form of reward is to see that the security mechanisms are working and that the action the user chose is, in fact, making them safer.

  33. IMPROVING SECURITY COMPLIANCE AND DECISION MAKING • When an antivirus or antispyware product finds and removes malicious code, it often issues a notification that it has found and mitigated a threat.

  34. Improve the awareness of risk • People often believe they are at less risk compared to others. • Increase user awareness of the risks they face. • Security messages should be instantly distinguishable from other message dialogs; they should look and sound very different.

  35. Catch corporate security policy violators • Having a corporate security policy that is not monitored or enforced is tantamount to having laws but no police. • Security systems should have good auditing capabilities. • The best deterrent to breaking the rules is not the severity of consequences but the likelihood of being caught.
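
A minimal sketch of the auditing capability the slide calls for: append every security-relevant action to a log so that violations are likely to be noticed. The event fields and the log file name are assumptions for illustration.

```python
# Record security-relevant events as JSON lines in an append-only log.
import json, logging, time

audit_log = logging.getLogger("audit")
audit_log.setLevel(logging.INFO)
audit_log.addHandler(logging.FileHandler("audit.log"))  # assumed destination

def record_event(user: str, action: str, resource: str) -> None:
    """Append one security-relevant event to the audit log."""
    audit_log.info(json.dumps({
        "ts": time.time(),
        "user": user,
        "action": action,
        "resource": resource,
    }))

record_event("alice", "export", "customer_database")
```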

  36. Reduce the cost of implementing security • To accomplish a task, users often seek the path of least resistance that satisfies the primary goal. • By making the secure choice the easiest for the user to implement, one takes advantage of normal user behavior and gains compliance.

  37. Reduce the cost of implementing security • One way to reduce the cost of security is to employ secure default settings. • Most users never change the default settings of their applications. • This is the “Secure by Default” principle. • While good default settings can increase security, system designers must be careful that users do not find an easier way to slip around them.
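
A minimal sketch of the principle, with hypothetical setting names: the configuration object starts in its most secure state, so the majority who never change defaults stay protected.

```python
# "Secure by Default": a user who changes nothing gets the secure choices.
# The field names below are hypothetical examples.
from dataclasses import dataclass

@dataclass
class AppSettings:
    require_tls: bool = True          # encrypted transport on by default
    auto_update: bool = True          # patches arrive without user effort
    telemetry_sharing: bool = False   # privacy-sensitive features are opt-in

settings = AppSettings()
print(settings)   # the out-of-the-box configuration is already secure
```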

  38. CONCLUSION • We can increase compliance if we work with the psychological principles that drive behavior.

  39. Problem #4 • Consider some software product that you regularly use, some website that you regularly visit, or some software product that you develop as part of your job. Briefly describe this product. • Discuss how well it meets the Principle of Psychological Acceptability for users of this product or website. • Discuss how this product or website could be improved from the psychological viewpoint.
