
Lecture 6 – Psychology: From Usability and Risk to Scams






  1. Lecture 6 – Psychology: From Usability and Risk to Scams
  Security, Computer Science Tripos part 2, Ross Anderson

  2. Usability and Psychology
  • ‘Why Johnny Can’t Encrypt’ – a study of the encryption program PGP – showed that 90% of users couldn’t get it right, given 90 minutes
  • Private/public and encryption/signing keys, plus trust labels, were too much – people would delete private keys, or publish them, or whatever
  • Security is hard – unmotivated users, abstract security policies, lack of feedback …
  • Much better to have safe defaults (e.g. encrypt and sign everything)
  • But economics often push the other way …

  3. Usability and Psychology (2)
  • 1980s concerns with passwords were technical: cracking /etc/passwd, LAN sniffers, retry counters
  • 1990s concerns: weak defaults, attacks at the point of entry (vertical ATM keypads), and whether the user can choose a good password and not write it down
  • Our 1998 password trial: a control group, versus random passwords, versus passphrases
  • The compliance problem; and can someone who chooses a bad password harm only himself?
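The trade-off behind the password trial can be made concrete with a back-of-the-envelope entropy calculation. This sketch compares a random 8-character password against a 4-word passphrase; the alphabet and word-list sizes are illustrative assumptions, not parameters from the 1998 study.

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a string chosen uniformly at random."""
    return length * math.log2(alphabet_size)

# Hypothetical parameters, not from the trial itself:
random_pw  = entropy_bits(62, 8)     # 8 chars from [a-zA-Z0-9]
passphrase = entropy_bits(2048, 4)   # 4 words from a 2048-word list (44.0 bits)

print(f"random 8-char password: {random_pw:.1f} bits")
print(f"4-word passphrase:      {passphrase:.1f} bits")
```

The point is that a memorable passphrase can match the guessing resistance of a random password that users struggle to remember – which is why the trial compared the two.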

  4. Social Engineering
  • Use a plausible story, or just bully the target
  • ‘What’s your PIN, so I can cancel your card?’
  • The NYHA case
  • The Patricia Dunn case
  • Kevin Mitnick, ‘The Art of Deception’
  • Traditional responses:
    • mandatory access control
    • operational security

  5. Social Engineering (2)
  • Social psychology:
    • Solomon Asch, 1951: two-thirds of subjects would deny obvious facts to conform to the group
    • Stanley Milgram, 1964: a similar number will administer torture if instructed by an authority figure
    • Philip Zimbardo, 1971: you don’t need authority – the subjects’ situation/context is enough
  • The Officer Scott case
  • And what about users you can’t train (customers)?

  6. Phishing
  • Started in 2003, with six attacks reported (there had been isolated earlier attacks on AOL passwords)
  • By 2006, UK banks had lost £35m (£33m by one bank) and US banks maybe $200m
  • Early phish were crude and greedy, but the phishermen learned fast
  • E.g. ‘Thank you for adding a new email address to your PayPal account’
  • The banks make it easy for them – e.g. Halifax

  7. Phishing (2)
  • Banks pay firms to take down phishing sites
  • A couple have moved to two-factor authentication (CAP) – we’ll discuss this later
  • At present, the phished banks are those with poor back-end controls and slow asset recovery
  • One gang (Rockphish) is doing half to two-thirds of the business
  • Mule recruitment seems to be a serious bottleneck

  8. Types of phishing website
  • Misleading domain name
    http://www.banckname.com/
    http://www.bankname.xtrasecuresite.com/
  • Insecure end user
    http://www.example.com/~user/www.bankname.com/
  • Insecure machine
    http://www.example.com/bankname/login/
    http://49320.0401/bankname/login/
  • Free web hosting
    http://www.bank.com.freespacesitename.com/
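Several of these URL patterns share one tell: the bank’s name appears somewhere other than the registered domain. A minimal sketch of that heuristic, using a hypothetical bank domain `bankname.com` (a real check would also need a public-suffix list and would miss misspellings like `banckname`):

```python
from urllib.parse import urlparse

BANK = "bankname.com"  # hypothetical bank domain for illustration

def suspicious(url: str) -> bool:
    """Flag URLs where the bank's name appears outside its own
    registered domain -- an illustrative heuristic only."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if host == BANK or host.endswith("." + BANK):
        return False  # genuinely under the bank's domain
    # bank name in another host's subdomain or in the path
    return "bankname" in (host + parsed.path).lower()

print(suspicious("http://www.bankname.com/login"))                   # False
print(suspicious("http://www.bankname.xtrasecuresite.com/"))         # True
print(suspicious("http://www.example.com/~user/www.bankname.com/"))  # True
```

Note that the misleading-domain variant with a misspelt name would slip past this substring test, which is one reason automated phishing detection is harder than it looks.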

  9. Rock-phish is different!
  • Compromised machines run a proxy
  • Domains do not infringe trademarks
    • name servers are usually done in a similar style
  • Distinctive URL style
    http://session9999.bank.com.lof80.info/signon/
  • Some usage of “fast-flux” from Feb ’07 onwards
    • viz. resolving to 5 (or 10 …) IP addresses at once
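Fast-flux hosting shows up in DNS as a large, churning pool of A records. This sketch flags it from repeated lookup results; the threshold and the sample IP sets are invented for illustration (real detection would also weigh TTLs and AS diversity).

```python
def looks_fast_flux(resolutions: list[list[str]], threshold: int = 5) -> bool:
    """Given the A-record sets observed over repeated lookups of one
    hostname, flag fast-flux-style hosting: many distinct IPs.
    Illustrative threshold; not a production detector."""
    distinct_ips = {ip for batch in resolutions for ip in batch}
    return len(distinct_ips) >= threshold

# A stable site keeps resolving to the same address:
stable = [["192.0.2.10"], ["192.0.2.10"], ["192.0.2.10"]]
# A fast-flux domain answers with a rotating pool of compromised hosts:
flux = [["198.51.100.1", "198.51.100.2", "198.51.100.3",
         "198.51.100.4", "198.51.100.5"],
        ["198.51.100.6", "198.51.100.7", "198.51.100.8",
         "198.51.100.9", "198.51.100.10"]]

print(looks_fast_flux(stable))  # False
print(looks_fast_flux(flux))    # True
```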

  10. Mule recruitment
  • The proportion of spam devoted to recruitment shows that this is a significant bottleneck
  • Aegis, Lux Capital, Sydney Car Centre, etc.
    • a mixture of real firms and invented ones
    • some “fast-flux” hosting involved
  • Only the vigilantes are taking these down
    • the impersonated firms are clueless and/or unmotivated
  • Long-lived sites are usually indexed by Google

  11. Fake banks
  • These are not “phishing”
    • no one takes them down, apart from the vigilantes
  • The usual pattern of repeated phrases on each new site means that googling finds more examples
    • sometimes old links are left in (hand-edited!)
  • Sometimes part of a “419” scheme
    • it is inconvenient to show the existence of a dictator’s $millions in a real bank account!
  • Or sometimes part of a lottery scam
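The “repeated phrases” observation can be automated: pages built from the same template share long runs of identical wording. A sketch using word shingles and Jaccard similarity, with invented page texts (the bank names and phrasing are hypothetical):

```python
def shingles(text: str, k: int = 3) -> set[str]:
    """k-word shingles of a page's visible text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of shingle sets -- a high value suggests
    the same template was reused across 'banks'."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Invented examples of two fake-bank front pages sharing a template:
site1 = "Welcome to First Atlantic Trust your secure offshore banking partner"
site2 = "Welcome to Pacific Crown Trust your secure offshore banking partner"
print(f"{similarity(site1, site2):.2f}")  # 0.33
```

Searching for the shared shingles (“your secure offshore banking partner”) is essentially what the googling-for-repeated-phrases trick does by hand.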

  12. Fraud and Phishing Patterns
  • Fraudsters do pretty well everything that normal marketers do
  • The IT industry has abandoned manuals – people learn by doing, and marketers train them in unsafe behaviour (click on links …)
  • The banks’ approach is ‘blame and train’ – long known not to work in safety-critical systems
  • Their instructions – ‘look for the lock’, ‘click on images, not URLs’, ‘parse the URL’ – are easily turned round, and discriminate against non-geeks

  13. Results
  • Ability to detect phishing is correlated with SQ–EQ
  • It is (independently) correlated with gender
  • So the gender HCI issue applies to security too
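For readers unfamiliar with how such a correlation is computed, here is a plain-Python Pearson correlation sketch. The score lists are invented for illustration – they are not the study’s data.

```python
def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented illustrative scores, NOT the study's data:
sq_minus_eq  = [10, 5, 0, -5, -10, 15, -15, 2]   # systemising minus empathising
phish_detect = [8, 6, 5, 4, 3, 9, 2, 5]          # phishing-detection test score

print(f"r = {pearson(sq_minus_eq, phish_detect):.2f}")
```

A value of r near +1 would indicate that higher SQ–EQ scores go with better phishing detection, which is the shape of the reported result.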
