Learn about discount evaluation techniques such as heuristic evaluation and cognitive walkthrough for quick and cost-effective usability assessment. Understand the basis, methods, and advantages of these techniques, along with their challenges. Explore Nielsen's heuristics and apply them to evaluate the usability of a library website.
Discount Evaluation: Evaluating with Experts
Agenda • Project was due today • You will demo your prototype next class • Heuristic Evaluation • Cognitive Walkthrough
Discount Evaluation Techniques • Basis: • Observing users can be time-consuming and expensive • Try to predict usability rather than observing it directly • Conserve resources (quick & low cost)
Approach: inspections • Expert reviewers used • HCI experts interact with the system, try to find potential problems, and give prescriptive feedback • Best if they • Haven’t used an earlier prototype • Are familiar with the domain or task • Understand user perspectives
Discount Evaluation Methods • Scenarios • Heuristic Evaluation • Cognitive Walkthrough
Heuristic Evaluation • Developed by Jakob Nielsen • Several expert usability evaluators assess system based on simple and general heuristics (principles or rules of thumb) (Web site: www.useit.com)
Heuristic Evaluation • Mainly qualitative • use with experts • predictive
Procedure • Gather inputs • Evaluate system • Debriefing and collection • Severity rating
1: Gather Inputs • Who are evaluators? • Need to learn about domain, its practices • Get the prototype to be studied • May vary from mock-ups and storyboards to a working system
How many experts? • Nielsen found that about 5 evaluators find roughly 75% of the problems • Adding more evaluators finds additional problems, but with diminishing returns
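The "about five evaluators" figure follows from a simple problem-discovery model. Below is a minimal sketch, assuming each evaluator independently detects a given problem with a fixed probability; the 0.31 value is a commonly cited estimate and is an assumption here, not something stated on the slide.

```python
# Sketch of the diminishing-returns curve behind "about 5 evaluators find
# most of the problems". Assumes each evaluator independently detects a
# given problem with probability LAM; 0.31 is a commonly cited estimate
# (an assumption here), and smaller values push the 5-evaluator result
# closer to the 75% quoted above.

LAM = 0.31

def proportion_found(n_evaluators: int, lam: float = LAM) -> float:
    """Expected proportion of usability problems found by n evaluators."""
    return 1.0 - (1.0 - lam) ** n_evaluators

if __name__ == "__main__":
    for n in range(1, 11):
        print(f"{n:2d} evaluator(s): {proportion_found(n):5.0%} of problems")
```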
2: Evaluate System • Reviewers evaluate system based on high-level heuristics. • Where to get heuristics? • http://www.useit.com/papers/heuristic/ • http://www.asktog.com/basics/firstPrinciples.html
Heuristics • Use simple and natural dialog • Speak the user’s language • Minimize memory load • Be consistent • Provide feedback • Provide clearly marked exits • Provide shortcuts • Provide good error messages • Prevent errors
Nielsen’s Heuristics • Visibility of system status • Aesthetic and minimalist design • User control and freedom • Consistency and standards • Error prevention • Recognition rather than recall • Flexibility and efficiency of use • Recognition, diagnosis and recovery from errors • Help and documentation • Match between system and real world
Groupware heuristics • Provide the means for intentional and appropriate verbal communication • Provide the means for intentional and appropriate gestural communication • Provide consequential communication of an individual’s embodiment • Provide consequential communication of shared artifacts (i.e., artifact feedthrough) • Provide protection • Manage the transitions between tightly- and loosely-coupled collaboration • Support people with the coordination of their actions • Facilitate finding collaborators and establishing contact (Baker, Greenberg, and Gutwin, CSCW 2002)
Ambient heuristics • Useful and relevant information • “Peripherality” of display • Match between design of ambient display and environments • Sufficient information design • Consistent and intuitive mapping • Easy transition to more in-depth information • Visibility of state • Aesthetic and pleasing design (Mankoff et al., CHI 2003)
Process • Perform two or more passes through system inspecting • Flow from screen to screen • Each screen • Evaluate against heuristics • Find “problems” • Subjective (if you think it is, it is) • Don’t dwell on whether it is or isn’t
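A minimal sketch of how one reviewer's pass might be recorded, using the heuristic wording from the list above. The Problem record and log_problem helper are illustrative names rather than part of any standard tool, and the example screen and finding are hypothetical.

```python
# Illustrative per-reviewer problem log for a heuristic evaluation pass.
# Heuristic wording follows the Nielsen list above; the screen and finding
# in the example are hypothetical.

from dataclasses import dataclass

NIELSEN_HEURISTICS = [
    "visibility of system status",
    "aesthetic and minimalist design",
    "user control and freedom",
    "consistency and standards",
    "error prevention",
    "recognition rather than recall",
    "flexibility and efficiency of use",
    "recognition, diagnosis and recovery from errors",
    "help and documentation",
    "match between system and real world",
]

@dataclass
class Problem:
    screen: str        # where the reviewer saw it
    heuristic: str     # which heuristic it appears to violate
    description: str   # what the reviewer observed (subjective is fine)

def log_problem(log, screen, heuristic, description):
    """Append one finding; insist on one of the agreed heuristics."""
    assert heuristic in NIELSEN_HEURISTICS, "use one of the agreed heuristics"
    log.append(Problem(screen, heuristic, description))

# Hypothetical example finding:
reviewer_log = []
log_problem(reviewer_log, "search results", "visibility of system status",
            "no indication that the catalog search is still running")
```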
3: Debriefing • Organize all problems found by different reviewers • At this point, decide what are and aren’t problems • Group, structure • Document and record them
4: Severity Rating • Based on • frequency • impact • persistence • market impact • Rating scale • 0: not a problem • 1: cosmetic issue, only fixed if extra time • 2: minor usability problem, low priority • 3: major usability problem, high priority • 4: usability catastrophe, must be fixed
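A companion sketch for the debriefing and rating steps, assuming reviewer findings arrive as simple (reviewer, screen, heuristic, description) tuples. The 0-4 scale mirrors the slide; the duplicate-merging rule and all names are simplifying assumptions for illustration.

```python
# Illustrative debrief step: fold individual findings into one master list
# and attach a 0-4 severity during the group discussion. The scale mirrors
# the slide; merging on identical screen + description text is a crude,
# assumed duplicate test.

from dataclasses import dataclass, field
from enum import IntEnum

class Severity(IntEnum):
    NOT_A_PROBLEM = 0   # not a problem
    COSMETIC = 1        # fixed only if there is extra time
    MINOR = 2           # low priority
    MAJOR = 3           # high priority
    CATASTROPHE = 4     # must be fixed

@dataclass
class MasterItem:
    screen: str
    heuristic: str
    description: str
    reported_by: list = field(default_factory=list)
    severity: Severity = Severity.NOT_A_PROBLEM   # set during debriefing

def merge_findings(findings):
    """findings: iterable of (reviewer, screen, heuristic, description) tuples."""
    master = {}
    for reviewer, screen, heuristic, description in findings:
        key = (screen, description.strip().lower())
        item = master.setdefault(key, MasterItem(screen, heuristic, description))
        item.reported_by.append(reviewer)
    return list(master.values())
```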
Advantages • Few ethical issues to consider • Inexpensive, quick • A reviewer who is practiced in the method and knowledgeable about the domain is especially valuable
Challenges • Very subjective assessment of problems • Depends on the expertise of the reviewers • Why are these the right heuristics? • Others have been suggested • How to determine what is a true usability problem • Some recent papers suggest that many identified “problems” really aren’t
Your turn: • Library – main page, finding a book • Use Nielsen’s heuristics (p 686) • List all problems • Come up to the board and put up at least one new one • We’ll rate as a group
Nielsen’s Heuristics • Visibility of system status • Aesthetic and minimalist design • User control and freedom • Consistency and standards • Error prevention • Recognition rather than recall • Flexibility and efficiency of use • Recognition, diagnosis and recovery from errors • Help and documentation • Match between system and real world
Heuristic Evaluation: review • Design team provides prototype and chooses a set of heuristics • Experts systematically step through entire prototype and write down all problems • Design team creates master list, assigns severity rating • Design team decides how to modify design
Cognitive Walkthrough • Assess learnability and usability through simulation of the way users explore and become familiar with an interactive system • A usability “thought experiment” • Like a code walkthrough (software engineering) • From Polson, Lewis, et al. at the University of Colorado Boulder
Cognitive Walkthrough • Qualitative • Predictive • With experts • to examine learnability and novice behavior
CW: Process • Construct carefully designed tasks from system spec or screen mock-up • Walk through (cognitive & operational) activities required to go from one screen to another • Review actions needed for task, attempt to predict how users would behave and what problems they’ll encounter
CW: Requirements • Description of users and their backgrounds • Description of task user is to perform • Complete list of the actions required to complete task • Prototype or description of system
CW: Assumptions • User has a rough plan • User explores the system, looking for actions that will contribute to performing the task • User selects the action that seems best for the desired goal • User interprets the response and assesses whether progress has been made toward completing the task
CW: Methodology • Step through the action sequence • Action 1 • Response A, B, .. • Action 2 • Response A • ... • For each one, ask four questions and try to construct a believable success (or failure) story
CW: Questions • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? (is it visible) • Once found, will they know it’s the right one for desired effect? (is it correct) • Will users understand feedback after action?
CW: Answering the Questions • 1. Will user be trying to produce effect? • Typical supporting evidence • It is part of their original task • They have experience using the system • The system tells them to do it • No evidence? • Construct a failure scenario • Explain, back up opinion
CW: Next Question • 2. Will user notice action is available? • Typical supporting evidence • Experience • Visible device, such as a button • Perceivable representation of an action, such as a menu item
CW: Next Question • 3. Will user know it’s the right one for the effect? • Typical supporting evidence • Experience • Interface provides a visual item (such as a prompt) to connect the action to its effect • All other actions look wrong
CW: Next Question • 4. Will user understand the feedback? • Typical supporting evidence • Experience • Recognize a connection between a system response and what the user was trying to do
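To keep the bookkeeping manageable, a walkthrough can be laid out as one row per action-question pair. Here is a minimal sketch: the question wording comes from the slides above, while make_worksheet and its field names are illustrative assumptions, not part of any published CW tooling.

```python
# Minimal cognitive-walkthrough worksheet: one row per (action, question),
# with space for supporting evidence or a failure story.

CW_QUESTIONS = [
    "Will users be trying to produce whatever effect the action has?",
    "Will users be able to notice that the correct action is available?",
    "Once found, will they know it is the right one for the desired effect?",
    "Will users understand the feedback after the action?",
]

def make_worksheet(actions, questions=CW_QUESTIONS):
    """Return one dict per (action, question); evidence is filled in by hand."""
    return [
        {"action": action, "question": question, "evidence_or_failure_story": ""}
        for action in actions
        for question in questions
    ]
```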
User characteristics • Technology-savvy users • Familiar with computers • Understand the Internet radio concept • Have just joined and downloaded this Internet radio player
Task: add a new station to the presets • Click genre • Scroll the list and choose a genre • Assuming the station is on the first page, add it to the presets: right-click on the station and choose “add to presets” from the popup menu • Click OK on Presets
Action: Pick a genre • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? • Once found, will they know it’s the right one for desired effect? • Will users understand feedback after action?
Action: Add to Preset • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? • Once found, will they know it’s the right one for desired effect? • Will users understand feedback after action?
Action: Click OK • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? • Once found, will they know it’s the right one for desired effect? • Will users understand feedback after action?
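As a usage example, the action list from the internet-radio task above can be fed through the make_worksheet sketch from earlier (this snippet assumes those definitions are in scope).

```python
# Reuses CW_QUESTIONS / make_worksheet from the sketch above (assumed in scope).
RADIO_TASK_ACTIONS = [
    "Click genre",
    "Scroll list and choose genre",
    "Right-click station, choose 'add to presets' from popup menu",
    "Click OK on Presets",
]

for row in make_worksheet(RADIO_TASK_ACTIONS):
    print(f"{row['action']} | {row['question']}")
```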
CW Summary • Advantages • Explores important characteristic of learnability • Novice perspective • Detailed, careful examination • Working prototype not necessary • Disadvantages • Can be time-consuming • May find problems that aren’t really problems • Narrow focus, may not evaluate entire interface
Your turn • Library finding book • What are our tasks? • What are the actions?
CW: Questions • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? (is it visible) • Once found, will they know it’s the right one for desired effect? (is it correct) • Will users understand feedback after action?
CW: review • Design team creates prototype, user characteristics • Design team chooses tasks, lists out every action and response • Experts answer 4 questions for every action/response • Design team gathers responses and feedback • Design determines how to modify the design
Next time • Heuristic evaluation of your own prototypes • Bring to class • Your prototype • Any other sketches, storyboards, etc. that you have to convey your design • The set of heuristics you want the evaluating team to use (printed out)
Process • Pairs of teams will evaluate each other • One team demos, then the other team • Write down everything you find and give the list to the other team • Each project team compiles all the problems, assigns severity ratings, and includes them in its Part 3 writeup • The TA and I will interrupt each team to see a demo of their prototype
Heuristic Eval team pairs • Mighty Morphin and UCook • College Registration and Laguna Bleach • One Password and Food Networking • Team No Name and Best for Last
Homework: last one • Cognitive walkthrough of each other’s prototypes • Gather materials for your prototype: • User characteristics • Tasks (1 or 2) and action lists • Storyboard or running prototype • Perform a CW for another group
Homework • Best for Last and Mighty Morphin • UCook and College Registration • Laguna Bleach and One Password • Food Networking and No Name • To turn in on your group page: • List of tasks and actions for your prototype • Date and people who evaluated your prototype • Due: April 15 after class