
Heuristic Evaluation

Presentation Transcript


  1. Heuristic Evaluation (Loren Terveen, CS 5115, Fall 2010, November 1)

  2. Hall of Shame/Fame Liya Ai & Zhe Jiang

  3. Hall of Shame: FREESPIRIT treadmill
  • Poor visibility: the buttons are all mixed together, with the same color and shape; the words attached to the buttons are small; the incline and speed figures are small
  • Difficult to get feedback when adjusting the speed or incline (the words on the display are small and far from the buttons)
  • Gulf of execution: there are Program up/down buttons, but what do P-0 and P-1 mean?
  • Poor mapping: the scale runs 10 to 2, rather than putting the smaller number on the left

  4. Hall of Shame for “small screen”

  5. Hall of Fame: WOODWAY treadmill
  • Good visibility: different shapes, different colors, larger words, circle backgrounds; yellow means slow down, green means go ahead; triangles point up or down
  • Good feedback and evaluation: e.g., you can see the value of the speed
  • Bridges the gulf of execution by putting the function on the button
  • "CRAP": the most frequently used buttons are in the center

  6. Hall of Fame/Shame: Body Thermometer (Aditya Kulkarni & Abdeltawab Hendawi)

  7. Hall of Shame: Glass Thermometer
  • Affordance: throwing, breaking
  • Visibility: poor; hard to read the mercury level
  • Gulf of evaluation: big; low accuracy and precision due to the poor visibility
  • Feedback: none; you can't tell if it's done
  • Knowledge in the world: limited; Celsius vs. Fahrenheit is ambiguous, and there is no indication of how to reset it

  8. Hall of Fame: Digital Thermometer
  • Affordance: clicking
  • Mapping: buttons to start, reset, and change C/F
  • Visibility: a suitable screen to read the temperature
  • Feedback: a sound when it is done
  • Knowledge in the world: supports both Celsius and Fahrenheit; a button to reset
  • Gulf of evaluation: small; high accuracy and precision in the readings

  9. Thank You

  10. Heuristic Evaluation

  11. Heuristic Evaluation
  • Usability heuristics
  • The heuristic evaluation method
  • See http://www.useit.com/papers/heuristic/

  12. Pros / Cons
  + Cheap (no special lab or equipment)
  + Easy
  + Fast (about 1 day)
  + Cost-effective
  + Detects many problems without users
  + Coverage
  + Catches cross-task interactions
  - Requires subjective interpretation / application
  - Does not specify how to fix problems
  - Performance improves as evaluator knowledge increases

  13. ... vs. Cognitive Walkthroughs
  • H.E.s are not task-centered
  • H.E.s work better on higher-fidelity prototypes (but can be done on LoFi)

  14. Procedure
  • Independent evaluation: 3-5 evaluators
  • Merge
  • Rate
  • Brainstorm
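
The merge step can be pictured as deduplicating the evaluators' independent problem lists. A minimal sketch in TypeScript; the Finding shape and its field names are hypothetical illustrations, not part of the method:

```ts
// Each evaluator submits an independent problem list; the merge step
// collapses duplicates so the team rates each unique problem once.
interface Finding {
  description: string;   // what is wrong in the UI
  heuristics: string[];  // violated heuristic(s), e.g. ["H4"]
  evaluator: string;
}

function mergeFindings(lists: Finding[][]): Finding[] {
  const byDescription = new Map<string, Finding>();
  for (const list of lists) {
    for (const f of list) {
      const seen = byDescription.get(f.description);
      if (seen) {
        // Found independently by several evaluators: union the heuristics.
        seen.heuristics = [...new Set([...seen.heuristics, ...f.heuristics])];
      } else {
        byDescription.set(f.description, { ...f });
      }
    }
  }
  return [...byDescription.values()];
}
```

In practice deciding that two findings are "the same problem" is a human judgment call; exact-string matching here just stands in for that step.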

  15. Why multiple evaluators?

  16. Why multiple evaluators? (figure: average over 6 case studies)

  17. So how many evaluators? More is better...

  18. Cost-benefit analysis
  • Based on estimates of the value of finding problems and the cost of doing the evaluation
  • Note: a ratio of 50 means that investing $10K leads to a value of $500K
  • More is not necessarily better!
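
The "more is better, but with diminishing returns" curve behind slides 15-18 is usually modeled with Nielsen and Landauer's formula, Found(i) = N(1 - (1 - λ)^i). A sketch, assuming their reported average λ ≈ 0.31; the actual value varies with the system and the evaluators' skill:

```ts
// Expected share of usability problems found by i evaluators,
// per Nielsen & Landauer's model: Found(i) = N * (1 - (1 - lambda)^i).
// lambda = 0.31 is their reported average; treat it as an assumption.
const lambda = 0.31;
const proportionFound = (i: number): number => 1 - Math.pow(1 - lambda, i);

for (let i = 1; i <= 10; i++) {
  console.log(`${i} evaluators: ${(100 * proportionFound(i)).toFixed(0)}% of problems`);
}
// With lambda = 0.31, 3 evaluators find ~67% and 5 find ~84%; each extra
// evaluator adds less while costing the same -- hence the cost-benefit
// framing, and why more is not necessarily better.
```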

  19. Procedure: the job of each individual evaluator
  • Inspect the UI
  • Pass 1: get an overall feel for the system
  • Pass 2: examine the various interface elements: do they follow the heuristics?
  • Note: Nielsen's heuristics can be augmented
  • Output: a list of problems and the violated heuristic(s)

  20. Severity ratings
  Based on frequency, impact, and persistence:
  1 – cosmetic problem
  2 – minor usability problem
  3 – major usability problem; important to fix
  4 – usability catastrophe; must fix
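
When several evaluators rate the same merged problem, a common practice (not mandated by the method) is to average their ratings, since individual severity judgments are noisy. A small sketch:

```ts
// Severity scale from the slide: 1 cosmetic, 2 minor, 3 major, 4 catastrophe.
type Severity = 1 | 2 | 3 | 4;

// Averaging independent ratings smooths out individual judgment noise.
function meanSeverity(ratings: Severity[]): number {
  return ratings.reduce((sum, r) => sum + r, 0) / ratings.length;
}

// Example: three evaluators rate the same problem 2, 3, and 3.
console.log(meanSeverity([2, 3, 3]).toFixed(1)); // "2.7" -- between minor and major
```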

  21. Debriefing
  • Brainstorm solutions
  • Estimate effort

  22. The individual heuristics

  23. Nielsen’s heuristics

  24. H1. Simple and natural dialog
  • Exploit the user's conceptual model
  • Match user tasks in as natural a way as possible
  • Maximize the mapping between interface and task semantics

  25. Simple and natural dialog
  • Info should appear in a natural order
  • Remove or hide irrelevant or rarely needed info
  • Less is more: easier to learn, fewer errors, less distraction...
  • Good graphic design: use grouping and proximity to present related info; use color appropriately

  26. H2. Speak the User's Language
  • Use terminology based on the user's language for the task
  • Avoid engineering jargon
  • Use the user's native language
  • Use conventional meanings
  • View the interaction from the user's perspective
  • Do not force naming conventions
  • Exploit natural mappings and metaphors

  27. Poor use of language

  28. What does this do?

  29. H3. Reduce User Memory Load
  • Promote recognition over recall: menus, icons, and constrained input vs. command lines and free text fields
  • Use examples
  • Don't make users remember information between actions or screens
  • Leave information on the screen as long as it's needed
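
The recognition-over-recall point can be made concrete in code. A browser sketch, assuming a web UI; the option list is invented for illustration:

```ts
// Recognition over recall: present the valid choices instead of making
// the user remember and type them. The unit names are illustrative.
const units = ["Celsius", "Fahrenheit"];

const select = document.createElement("select");
for (const u of units) {
  const opt = document.createElement("option");
  opt.value = u;
  opt.textContent = u;  // the user recognizes the choice in a list...
  select.appendChild(opt);
}
document.body.appendChild(select);
// ...instead of a free text field where "celcius", "C", and "centigrade"
// would all have to be remembered, typed, and validated.
```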

  30. H4. Be consistent
  • Consistent with: the task, the user model, user experience (e.g., other interfaces)
  • Consistent within an application

  31. What should be consistent?
  • Location of information (e.g., on the menu bar)
  • Language / graphics
  • Layout
  • Input syntax
  • Use boilerplate forms
  • Effects: the same commands should have the same effect in the same situation (predictability)

  32. Let's look at some menus: Word, PowerPoint, Firefox, Gimp

  33. The Gimp

  34. H5. Provide Feedback
  • The system should continuously inform users what it is doing and how it is interpreting their actions
  • Response time: < 0.1 sec seems instantaneous; 0.1 – 1.0 sec is noticeable and breaks the direct manipulation ("DM") illusion, but doesn't disrupt the user's train of thought; > 10 sec and users will want to know so they can work on other tasks
  • Unpredictable amounts of time → progress bars
  • Too fast can be bad → may need to animate changes
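
These thresholds suggest a common implementation pattern: don't flash an indicator for sub-second work, but show progress once the delay becomes noticeable. A sketch; showSpinner and hideSpinner are hypothetical UI hooks, and the 1-second cutoff follows the slide's numbers:

```ts
declare function showSpinner(): void;  // hypothetical UI hooks
declare function hideSpinner(): void;

async function withFeedback<T>(work: Promise<T>): Promise<T> {
  // Only show the spinner if the work takes longer than ~1 second,
  // the point where the delay starts to break the user's train of thought.
  const timer = setTimeout(showSpinner, 1000);
  try {
    return await work;
  } finally {
    clearTimeout(timer);  // fast work never flashes a spinner at all
    hideSpinner();
  }
}
```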

  35. What mode am I in? What did I select? How is the system interpreting my actions?

  36. As a file is being saved…

  37. H6. Provide clearly marked exits
  • Users do not like to feel trapped; support exploration
  • Universal undo
  • Let users interrupt
  • Restore defaults
  • Cancel, Esc: get out of dialogs
  • Quit: leave the program
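
"Universal undo" is usually built on a stack of reversible actions. A minimal sketch of the idea; the Action shape is an assumption, not a standard API:

```ts
// Every state-changing action is recorded with its inverse,
// so the user is never trapped by a mistake.
interface Action {
  redo: () => void;
  undo: () => void;
}

class UndoStack {
  private done: Action[] = [];

  perform(a: Action): void {
    a.redo();
    this.done.push(a);
  }

  undo(): void {
    const a = this.done.pop();
    if (a) a.undo();  // nothing to undo is not an error, just a no-op
  }
}
```

Real systems add redo, grouping of small edits into one undoable step, and limits on stack depth.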

  38. H7. Provide shortcuts
  • Experienced users should be able to do frequent, familiar operations fast
  • Keyboard and mouse accelerators, function keys, menu shortcuts
  • Command completion, type-ahead
  • Double-clicking to invoke the default action
  • Macros and scripting
  • Reusable command/edit history (~60% of page visits are revisits)

  39. Shortcut examples (continued)
  • Keyboard accelerators for menus
  • Customizable toolbars and palettes for frequent actions
  • Recently used files
  • Scrolling within a page or by whole pages
  • Right-click brings up a menu
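
As a concrete example of the first item, a browser keyboard accelerator might look like this; saveDocument is a hypothetical application function:

```ts
declare function saveDocument(): void;  // hypothetical app function

// Ctrl+S (or Cmd+S on macOS) triggers save without the menu path.
document.addEventListener("keydown", (e: KeyboardEvent) => {
  if ((e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "s") {
    e.preventDefault();  // keep the browser's own Save dialog out of the way
    saveDocument();
  }
});
```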

  40. H8. Provide good error messages
  • Speak the user's language! Use clear, simple, and polite language
  • Be specific about the problem and offer possible solutions
  • Turn it into a learning experience (encourage exploration)
  • But preferably...
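
One way to build the "specific problem plus possible solution" habit into code is to carry both parts together. A sketch; the shape is an illustration, not a standard error API:

```ts
// A good error message names the specific problem and suggests a fix.
function userError(problem: string, fix: string): string {
  return `${problem}\n${fix}`;
}

// Instead of "Error 37" or "Invalid input":
console.log(userError(
  "The file 'report.doc' could not be saved because the disk is full.",
  "Free up space or choose another location, then try Save again."
));
```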

  41. H9. Prevent Errors
  • Look at error messages: could I prevent this instead?
  • Confirm risky operations
  • Minimize modes
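
Confirming a risky operation is the simplest of these to show in code. A browser sketch; deleteAll is hypothetical, and window.confirm is crude but makes the point:

```ts
declare function deleteAll(): void;  // hypothetical destructive action

function confirmThenDelete(): void {
  // The risky action requires one more deliberate step,
  // and Cancel doubles as a clearly marked exit (H6).
  if (window.confirm("Delete all items? This cannot be undone.")) {
    deleteAll();
  }
}
```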
