A case study in UI design and evaluation for computer security


  1. A case study in UI design and evaluation for computer security Rob Reeder Dependable Systems Lab Carnegie Mellon University February 16, 2006

  2. Memogate: A user interface scandal!!

  3. Overview • Task domain: Windows XP file permissions • Design of two user interfaces: native XP interface, Salmon • Evaluation: Which interface was better? • Analysis: Why was one better?

  4. Part 1: File permissions in Windows XP • File permissions task: Allow authorized users access to objects, deny unauthorized users access to objects • Objects: Files and folders • Users: People with accounts on the system • Access: 13 types, such as Read Data, Write Data, Execute, Delete

  5. Challenges for file permissions UI design • Potentially thousands of users, so setting permissions individually for each is infeasible • Thirteen access types, too many for a person to remember them all

  6. Grouping to handle users • Administrators • Power Users • Everyone • Admin-defined

  7. A problematic user grouping [Diagram: seven users (Xu, Ari, Miguel, Bill, Yasir, Cindy, Zack) divided between Group A and Group B]

  8. Precedence rules • No setting = Deny by default • Allow > No setting • Deny > Allow (> means “takes precedence over”)
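
These three rules are mechanical enough to capture in a few lines. Below is a minimal Python sketch, assuming a toy ACL of (principal, access type, setting) tuples; the function name effective_permission, the group name GroupA, and the access-type strings are illustrative, not Windows APIs.

```python
# A minimal sketch (not Windows' actual ACL machinery) of the precedence
# rules above: Deny beats Allow, Allow beats no setting, and no setting
# means deny by default. Entries may name a user or one of their groups.

ALLOW, DENY = "allow", "deny"

def effective_permission(user, groups, access_type, acl):
    """acl: list of (principal, access_type, ALLOW|DENY) entries."""
    principals = {user} | set(groups)
    verdicts = {setting
                for principal, acc, setting in acl
                if principal in principals and acc == access_type}
    if DENY in verdicts:     # Deny > Allow
        return False
    if ALLOW in verdicts:    # Allow > no setting
        return True
    return False             # no setting = deny by default

# A group-granted Allow loses to an explicit Deny:
acl = [("GroupA", "WriteData", ALLOW), ("Wesley", "WriteData", DENY)]
print(effective_permission("Wesley", ["GroupA"], "WriteData", acl))  # False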

  9. Grouping to handle access types

  10. Moral • Setting file permissions is quite complicated • But a good interface design can help!

  11. The XP file permissions interface

  12. The Salmon interface [Screenshot: Salmon permissions interface, shown for a folder named "ProjectF"]

  13. Example task: Wesley • Initial state: Wesley allowed READ & WRITE via a group • Final state: Wesley allowed READ, denied WRITE • What needs to be done: Deny Wesley WRITE

  14. What’s so hard? • Conceptually: Nothing! • Pragmatically: • User doesn’t know initial group membership • Not clear what changes need to be made • Checking work is hard

  15. Learning Wesley's initial permissions: (1) Click "Advanced" (2) Click "Effective Permissions" (3) Select Wesley (4) View Wesley's Effective Permissions

  16. Learning Wesley's group membership: (5) Bring up the Computer Management interface (6) Click on "Users" (7) Double-click Wesley (8) Click "Member Of" (9) Read Wesley's group membership

  17. Changing Wesley's permissions: (10) Click "Add…" (11) Deny Write (12) Click "Apply"

  18. Checking work: (13) Click "Advanced" (14) Click "Effective Permissions" (15) Select Wesley (16) View Wesley's Effective Permissions

  19. XP file permissions interface: Poor

  20. Part 2: Common security UI design problems • Poor feedback • Ambiguous labels • Violation of conventions • Hidden options • Omission errors

  21. Problem #1: Poor feedback. To see the effect of any change, the user must repeat the whole four-step inspection: (1) Click "Advanced" (2) Click "Effective Permissions" (3) Select Wesley (4) View Wesley's Effective Permissions

  22. Salmon: immediate feedback [Salmon screenshot]

  23. Problem #2: Labels (1/3). XP's top-level permission checkboxes: Full Control, Modify, Read & Execute, Read, Write, Special Permissions

  24. Problem #2: Labels (2/3). The thirteen fine-grained access types: Full Control, Traverse Folder/Execute File, List Folder/Read Data, Read Attributes, Read Extended Attributes, Create Files/Write Data, Create Folders/Append Data, Write Attributes, Write Extended Attributes, Delete, Read Permissions, Change Permissions, Take Ownership
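
The reason the six labels on the previous slide are ambiguous is that each one silently bundles several of these thirteen fine-grained types. The sketch below shows the general shape of that bundling; the exact Windows XP mapping differs in details, so treat the sets as approximate illustrations, not the real mapping.

```python
# A rough sketch of how XP's checkbox labels bundle fine-grained access
# types. Approximate, for illustration only; not the exact Windows mapping.

READ = {"List Folder/Read Data", "Read Attributes",
        "Read Extended Attributes", "Read Permissions"}
WRITE = {"Create Files/Write Data", "Create Folders/Append Data",
         "Write Attributes", "Write Extended Attributes"}
READ_AND_EXECUTE = READ | {"Traverse Folder/Execute File"}
MODIFY = READ_AND_EXECUTE | WRITE | {"Delete"}
FULL_CONTROL = MODIFY | {"Change Permissions", "Take Ownership"}

COMPOSITE = {
    "Read": READ,
    "Write": WRITE,
    "Read & Execute": READ_AND_EXECUTE,
    "Modify": MODIFY,
    "Full Control": FULL_CONTROL,
}

# In this toy mapping, "Modify" silently covers ten fine-grained types:
print(len(COMPOSITE["Modify"]))  # 10
```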

  25. Salmon: clearer labels [Salmon screenshot]

  26. Problem #3: Violating interface conventions. Normally, clicking a checkbox changes only that checkbox. But click a checkbox in one of these sets, and other checkboxes may change too. Confusing!
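
A small model makes the violation concrete: the composite checkboxes are not independent state but derived views over shared fine-grained bits, so setting one can make others appear to change. The toy mapping and names below are invented for illustration, not XP's actual implementation.

```python
# Sketch of why one click changes other checkboxes: the composite boxes
# are views over shared fine-grained bits. (Toy model, not XP's code.)

COMPOSITE = {
    "Read":   {"read_data", "read_attrs"},
    "Write":  {"write_data", "write_attrs"},
    "Modify": {"read_data", "read_attrs", "write_data", "write_attrs", "delete"},
}

granted = set()

def displayed_checkboxes():
    # A composite box shows as checked iff all its underlying bits are set.
    return {label for label, bits in COMPOSITE.items() if bits <= granted}

granted |= COMPOSITE["Modify"]   # user clicks the "Modify" checkbox...
print(displayed_checkboxes())    # ...and "Read" and "Write" light up too
```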

  27. Salmon: better checkboxes [Salmon screenshot]

  28. Problem #4: Hidden options

  29. Problem #4: Hidden options: (1) Click "Advanced" (2) Double-click entry (3) Click "Delete" checkbox

  30. Salmon: All options visible [Salmon screenshot]

  31. Problem #5: Omission errors (1/2) • Omission error: Failure to complete a necessary step in a procedure • Classic examples: • Forgetting to take your card out of the ATM after receiving cash • Forgetting to take your original after making photocopies • Omission errors are quite common in security-based tasks • Forgetting to change a default password • Forgetting to restart a service after making changes

  32. Problem #5: Omission errors (2/2) • The XP interface showed much potential for omission errors: • Users failed to make necessary changes to permissions that were hidden from view (e.g., Change Permissions and Take Ownership) • Users failed to check group membership, because the group membership information was so hard to find

  33. Salmon: Feedback helps prevent omission errors [Salmon screenshot]

  34. FLOCK: Summary of design problems • Feedback poor • Labels ambiguous • Omission error potential • Convention violation • Keeping options visible

  35. Part 3: Evaluation of XP and Salmon • Conducted laboratory-based user studies • Formative and summative studies for Salmon • I’ll focus on summative evaluation

  36. Advice for user studies • Know what you’re measuring! • Maintain internal validity • Maintain external validity

  37. Common usable security metrics • Accuracy – with what probability do users correctly complete tasks? • Speed – how quickly can users complete tasks? • Security – how difficult is it for an attacker to break into the system? • Etc. – satisfaction, learnability, memorability

  38. Measure the right thing! Keystroke dynamics analysis poses a real threat to any computer user. Hackers can easily record the sounds of users' keystrokes and obtain sensitive passwords from them. We address this issue by introducing a new typing method we call "Babel Type", in which users hit random keys when asked to type in their passwords. We have built a prototype and tested it on 100 monkeys with typewriters. We discovered that our method reduces the keystroke attack by 100%. This approach could potentially eliminate all risks associated with keystroke dynamics and increase user confidence. It remains an open question, however, how to let these random passwords authenticate the users.

  39. Measurement instruments • Speed – Easy; use a stopwatch, time users • Accuracy – Harder; need unambiguous definitions of “success” and “failure” • Security – Very hard; may require serious math, or lots of hackers

  40. Internal validity • Internal validity: Making sure your results are due to the effect you are testing • Manipulate one variable (in our case, the interface, XP or Salmon) • Control or randomize other variables • Use same experimenter • Experimenter reads directions from a script • Tasks presented in same text to all users • Assign tasks in different order for each user • Assign users randomly to one condition or other
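
For the last two bullets, the assignment plan can itself be code, which makes it auditable and repeatable. The sketch below is illustrative only: participant IDs and the task placeholders T3 through T7 are invented (the slides name only the Wesley and Jack tasks), and the slides do not say how assignment was actually done.

```python
# Sketch of two internal-validity controls: random assignment to interface
# condition and a fresh random task order per participant.
import random

PARTICIPANTS = [f"P{i:02d}" for i in range(1, 25)]   # 24 participants
TASKS = ["Wesley", "Jack", "T3", "T4", "T5", "T6", "T7"]  # placeholders

random.seed(2006)                # fixed seed so the plan is reproducible
random.shuffle(PARTICIPANTS)     # randomize who lands in which condition

assignment = {}
for i, p in enumerate(PARTICIPANTS):
    condition = "Salmon" if i % 2 == 0 else "XP"     # 12 per interface
    order = random.sample(TASKS, k=len(TASKS))       # per-person task order
    assignment[p] = (condition, order)

print(assignment["P01"])
```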

  41. External validity • External validity: Making sure your experiment can be generalized to the real world • Choose real tasks • Sources of real tasks: • Web forums • Surveys • Your own experience • Choose real participants • We were testing novice or occasional file-permissions users with technical backgrounds (so CMU students & staff fit the bill)

  42. User study compared Salmon to XP • Seven permissions-setting tasks; I'll discuss two: • Wesley • Jack • Metrics for comparison: • Accuracy (measured as deviations in users' final permission bits from correct permission bits) • Speed (time to task completion) • Not security – left that to Microsoft
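
That accuracy metric is easy to make concrete. Here is a minimal sketch, assuming permissions are recorded as a dict keyed by (principal, access type); the function name and scoring rule are illustrative and may not match the study's exact procedure.

```python
# Sketch of the accuracy metric above: count how many permission bits in a
# user's final state deviate from the correct state. (Illustrative only.)

def deviations(final_bits, correct_bits):
    """Both args: dict mapping (principal, access_type) -> 'allow'/'deny'/None."""
    keys = final_bits.keys() | correct_bits.keys()
    return sum(final_bits.get(k) != correct_bits.get(k) for k in keys)

correct = {("Wesley", "Read"): "allow", ("Wesley", "Write"): "deny"}
final   = {("Wesley", "Read"): "allow", ("Wesley", "Write"): "allow"}
print(deviations(final, correct))  # 1 deviation: Write should be denied
```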

  43. Study design • Between-participants comparison of interfaces • 12 participants per interface, 24 total • Participants were technical staff and students at Carnegie Mellon University • Participants were novice or occasional file permissions users

  44. Wesley and Jack tasks. Wesley task: • Initial state: Wesley allowed READ & WRITE • Final state: Wesley allowed READ, denied WRITE • What needs to be done: Deny Wesley WRITE. Jack task: • Initial state: Jack allowed READ, WRITE, & ADMINISTRATE • Final state: Jack allowed READ, denied WRITE & ADMINISTRATE • What needs to be done: Deny Jack WRITE & ADMINISTRATE

  45. Salmon outperformed XP in accuracy [Bar chart: per-task accuracy, Salmon vs. XP; Salmon improved accuracy by 300% on one task and 43% on the other]

  46. Salmon outperformed XP in accuracy [Same chart annotated with significance: p < 0.0001 for one task, p = 0.09 for the other]

  47. Salmon did not sacrifice speed [Bar chart: per-task completion times, XP vs. Salmon]

  48. Salmon did not sacrifice speed [Same chart annotated with significance: p = 0.35 for one task, p = 0.20 for the other]
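
The slides report p-values without naming the statistical test. For a between-participants design with two groups of 12, an independent-samples (Welch's) t-test is one standard choice; the sketch below uses made-up completion times, not the study's data.

```python
# Sketch of a significance test for a between-participants speed comparison.
# The times below are invented for illustration, not the study's data, and
# the study may have used a different test than Welch's t-test.
from scipy import stats

salmon_times = [212, 245, 198, 301, 260, 233, 280, 219, 247, 266, 228, 252]
xp_times     = [240, 310, 275, 295, 330, 260, 248, 305, 289, 270, 322, 254]

t, p = stats.ttest_ind(salmon_times, xp_times, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```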

  49. Part 4: Analysis • What led Salmon users to better performance?

  50. How users spent their time on the Wesley task [Chart: breakdown of task time]
