
Evaluation



    1. Evaluation: approaches, methods and framework

    2. Overview To explain the key concepts and terms used in evaluation. To introduce three main evaluation approaches and key evaluation methods. To introduce and explain the DECIDE evaluation framework.

    3. Evaluation in the design process

    4. Why, what, where and when to evaluate Why: to check that designers understand requirements, that users can use the product, that they are satisfied with it. What: a conceptual model, early prototypes of a new system and later, more complete prototypes. Where: in natural and laboratory settings. When: throughout design; finished products can be evaluated to collect information to inform new products.

    5. Evaluation approaches and methods

    6. Usability testing Testing products, not users. Representative tasks and users; controlled environmental settings. Typical methods: user tests, in which users are observed and timed to estimate performance and errors; and user satisfaction questionnaires and interviews, to obtain impressions and opinions.

    7. Usability testing & research Usability testing: improve products; few participants; results inform design; usually not completely replicable; conditions controlled as much as possible; procedure planned; results reported to developers. Experiments for research: discover knowledge; many participants; results validated statistically; must be replicable; strongly controlled conditions; experimental design; results reported to the scientific community.

    8. Usability lab and equipment

    9. Typical data gathered Time to complete a task. Time to complete a task after a specified time away from the product. Number and type of errors per task. Number of users making a particular error. Number of navigations to online help or manuals. Number of users completing the task successfully.
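    A minimal Python sketch of how such measures might be tallied once sessions are logged; the log format, field names and sample records below are illustrative assumptions, not part of the original material.

        from statistics import mean

        # Each record is one user's attempt at one task (illustrative format).
        sessions = [
            {"user": "u1", "seconds": 94, "errors": 2, "help_opens": 1, "completed": True},
            {"user": "u2", "seconds": 151, "errors": 5, "help_opens": 3, "completed": False},
            {"user": "u3", "seconds": 78, "errors": 0, "help_opens": 0, "completed": True},
        ]

        completed = [s for s in sessions if s["completed"]]
        print("Mean time to complete:", mean(s["seconds"] for s in completed), "s")
        print("Mean errors per task:", mean(s["errors"] for s in sessions))
        print("Navigations to help/manuals:", sum(s["help_opens"] for s in sessions))
        print("Completion rate:", len(completed) / len(sessions))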

    10. Field studies Performed in natural settings. Aim is to understand what users do naturally and how technology impacts them. Can be used to: identify opportunities for new technology; determine design requirements; decide how best to introduce new technology; evaluate technology in use.

    11. Data collection & analysis Data collection: observation and interviews (notes, pictures, recordings), video, logging. Analysis: categorising and interpreting the data.

    12. Analytical evaluation Inspections (heuristic evaluation and walkthroughs) and predictive models.

    13. Inspections Several kinds. Experts use their knowledge of users & technology to review software usability. Expert critiques (crits) can be formal or informal reports. Heuristic evaluation is a review guided by a set of heuristics. Walkthroughs involve stepping through a pre-planned scenario noting potential problems.

    14. Heuristic evaluation Developed by Jakob Nielsen in the early 1990s. Based on usability principles or heuristics distilled from an empirical analysis of 249 usability problems. These heuristics have been revised for current technology: mobile devices, wearables, virtual worlds, etc. Design principles form a basis for developing heuristics.

    15. Nielsen's heuristics Visibility of system status. Match between system and real world. User control and freedom. Consistency and standards. Error prevention. Recognition rather than recall. Flexibility and efficiency of use. Aesthetic and minimalist design. Help users recognize, diagnose, and recover from errors. Help and documentation.
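    At the debriefing, each expert's findings are typically prioritised against the heuristic they violate. A minimal Python sketch of one way to record and rank them; the finding records and the 0-4 severity scale are illustrative assumptions, not a standard tool.

        from collections import defaultdict

        NIELSEN_HEURISTICS = [
            "Visibility of system status",
            "Match between system and real world",
            "User control and freedom",
            "Consistency and standards",
            "Error prevention",
            "Recognition rather than recall",
            "Flexibility and efficiency of use",
            "Aesthetic and minimalist design",
            "Help users recognize, diagnose, recover from errors",
            "Help and documentation",
        ]

        # (evaluator, heuristic index, severity 0-4, note) -- illustrative findings.
        findings = [
            ("expert_a", 0, 3, "No progress indicator during search"),
            ("expert_b", 0, 2, "Status bar hidden on small screens"),
            ("expert_b", 5, 4, "Booking codes must be retyped from memory"),
        ]

        by_heuristic = defaultdict(list)
        for evaluator, h, severity, note in findings:
            by_heuristic[NIELSEN_HEURISTICS[h]].append((severity, evaluator, note))

        # Debriefing aid: list violated heuristics, worst problems first.
        for heuristic, issues in by_heuristic.items():
            for severity, evaluator, note in sorted(issues, reverse=True):
                print(f"[sev {severity}] {heuristic}: {note} ({evaluator})")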

    16. Discount evaluation Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
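    The figure traces back to Nielsen and Landauer's problem-discovery curve: the proportion of problems found by i evaluators is 1 - (1 - L)**i, where L is the chance that a single evaluator spots a given problem. A quick Python check; L varies from study to study, so the 0.28 used here is an assumption chosen to land in the slide's 75-80% range.

        # Proportion of usability problems found by i evaluators
        # (Nielsen & Landauer, 1993): 1 - (1 - lam)**i.
        def proportion_found(i: int, lam: float = 0.28) -> float:
            # lam = per-evaluator detection rate (assumed value; varies by study)
            return 1 - (1 - lam) ** i

        for i in (1, 3, 5, 10):
            print(f"{i} evaluators: {proportion_found(i):.0%}")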

    17. 3 stages in heuristic evaluation Briefing session to tell experts what to do. Evaluation period of 1-2 hours, in which each expert works separately, taking one pass to get a feel for the product and a second pass to focus on specific features. Debriefing session in which experts work together to prioritise problems.

    18. Advantages and disadvantages Few ethical & practical issues to consider, because users are not involved. Can be difficult & expensive to find experts; the best experts have knowledge of the application domain & users. Many issues reported are not problems, and many problems are not spotted.

    19. Cognitive walkthroughs Focus on ease of learning. The designer presents an aspect of the design & usage scenarios. Experts are told the assumptions about the user population, context of use, and task details. One or more experts walk through the design prototype with the scenario. Experts are guided by 3 questions.

    20. The 3 questions Will the correct action be sufficiently evident to the user? Will the user notice that the correct action is available? Will the user associate and interpret the response from the action correctly? As the experts work through the scenario they note problems.
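    A walkthrough can be recorded step by step, flagging each of the three questions per action. A minimal Python sketch; the scenario, class and field names are illustrative assumptions, not a standard tool.

        from dataclasses import dataclass

        # One step of a usage scenario, with the expert's answers to the
        # three walkthrough questions (illustrative structure).
        @dataclass
        class WalkthroughStep:
            action: str
            action_evident: bool        # Will the correct action be sufficiently evident?
            action_noticed: bool        # Will the user notice the action is available?
            response_understood: bool   # Will the user interpret the response correctly?
            note: str = ""

        scenario = [
            WalkthroughStep("Open the booking form", True, True, True),
            WalkthroughStep("Select a return date", True, False, True,
                            "Calendar hidden behind a small, unlabelled icon"),
        ]

        # Any 'no' answer marks a potential learning problem to report.
        for step in scenario:
            if not (step.action_evident and step.action_noticed
                    and step.response_understood):
                print(f"Problem at '{step.action}': {step.note}")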

    21. Pluralistic walkthrough Variation on the cognitive walkthrough theme. Performed by a carefully managed team. The panel of experts begins by working separately. Then there is managed discussion that leads to agreed decisions. The approach lends itself well to participatory design.

    22. Predictive models Provide a way of evaluating products or designs without directly involving users. Evaluation in terms of predictions of time and errors. Less expensive than user testing. Usefulness limited to systems with predictable tasks - e.g., telephone answering systems, mobile/cell phones, etc. Based on expert error-free behavior.
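    A well-known predictive model is the Keystroke-Level Model of Card, Moran and Newell, which predicts expert, error-free task time by summing standard operator times. A Python sketch; the operator values are approximate textbook figures and the key sequence is an invented example.

        # Keystroke-Level Model (KLM): sum operator times for a task sequence.
        OPERATOR_SECONDS = {
            "K": 0.2,   # press a key or button (skilled typist)
            "P": 1.1,   # point at a target with a mouse
            "H": 0.4,   # home hands between keyboard and mouse
            "M": 1.35,  # mental preparation
        }

        def klm_time(sequence: str) -> float:
            return sum(OPERATOR_SECONDS[op] for op in sequence)

        # Illustrative: mentally prepare, then key in a seven-digit phone number.
        print(f"Predicted expert time: {klm_time('M' + 'K' * 7):.2f} s")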

    23. Characteristics of approaches

    24. DECIDE: a framework to guide evaluation Determine the goals. Explore the questions. Choose the evaluation approach and methods. Identify practical issues. Decide how to deal with the ethical issues. Evaluate, analyze, interpret and present the data.

    25. Determine the goals What are the high-level goals of the evaluation? Who wants it and why? Some examples of goals: identify the best metaphor on which to base the design; check to ensure that the final interface is consistent; find out why many customers prefer to purchase paper airline tickets rather than e-tickets.

    26. Explore the questions All evaluations need goals & questions to guide them. E.g., the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: What are customers' attitudes to these new tickets? Are they concerned about security? Is the interface for obtaining them poor?

    27. Choose the evaluation approach & methods The evaluation approach influences the methods used and, in turn, how data is collected, analyzed and presented. E.g., field studies typically: involve observation and interviews; do not involve controlled tests in a laboratory; produce qualitative data.

    28. Practical and ethical issues Identify practical issues: selecting users, staying on schedule, finding evaluators, selecting equipment. Decide how to deal with ethical issues: participants should know the goals of the study and what will happen to the findings, have the privacy of their personal information protected, be free to leave when they wish, and be treated politely.

    29. Evaluate, interpret & present data The approach and methods used influence how data is evaluated, interpreted and presented. The following need to be considered: Reliability: can the study be replicated? Validity: is it measuring what you expected? Biases: is the process creating biases? Scope: can the findings be generalized? Ecological validity: is the environment influencing the findings?
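    Taken together, DECIDE reduces to a planning checklist. A minimal Python sketch populating it with the e-ticket example from the preceding slides; the structure and wording are illustrative, not part of the framework itself.

        # DECIDE as a planning checklist (illustrative structure).
        decide_plan = {
            "Determine the goals": "Why do customers prefer paper tickets to e-tickets?",
            "Explore the questions": ["Customers' attitudes to e-tickets?",
                                      "Security concerns?",
                                      "Is the purchasing interface poor?"],
            "Choose the approach & methods": "Field study: observation and interviews",
            "Identify practical issues": ["Select users", "Stay on schedule",
                                          "Find evaluators", "Select equipment"],
            "Decide on ethical issues": ["Informed goals", "Use of findings",
                                         "Privacy", "Freedom to leave", "Politeness"],
            "Evaluate the data": ["Reliability", "Validity", "Biases",
                                  "Scope", "Ecological validity"],
        }

        for stage, items in decide_plan.items():
            print(f"{stage}: {items}")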

    30. Case study: eSPACE Holiday planning system (Scaife, Halloran & Rogers, 2002). A collaboration between HCI researchers and a travel agent company.

    31. eSPACE stages Initial ethnographic study; conceptual model and low-fi prototype; conceptual model evaluation; low-fi prototype refinement; low-fi evaluation; high-fi development; high-fi evaluation.

    32. Initial ethnographic study A six-month study at various travel companies: observing, video recording, and interviewing agents and customers. Findings: physical asymmetry (an awkward arrangement in which the customer does nothing) and representational asymmetry (agent and customer hold different information and representations, and the representations they create also differ, causing translation overheads).

    33. Requirements Change the technology, to improve the physical arrangement. Change the information, to provide shared planning representations and information resources, and link them.

    34. Conceptual model Needed to represent location, timeline, budget, and additional information. A lab-setting study asked participants to plan a trip, providing them with a budget, a brochure, a list of destinations, and pen and paper. The chronological list was frequently edited and the map was the main representation used, so representations were created for the timeline and budget.

    35. Second low-fi Single or multiple representations? Is a single representation too complicated? Do multiple representations carry too much integration overhead? A lab study found the single representation too complicated and novel.

    36. High-fi prototype Used the same representations as the low-fi version, now linked. Two versions: a table and a three-screen system. Managing windows was difficult in the table version; the three-screen version, with representations locked to individual monitors, was easier to use.

    37. High-fi evaluation Feasibility of adoption: does it increase agents' efficiency? Does it improve customers' experience? An ethnographic-style study and customer interviews, largely at trade shows and promotion evenings. Agents: positive attitude, but wary about adoption. Customers: extremely positive, and remembered the store session in detail.

    38. Key points Evaluation & design are closely integrated in user-centred design. Three main evaluation approaches are: usability testing, field studies, and analytical evaluation. The main methods are: observing, asking users, asking experts, user testing, inspection, and modelling users' task performance. Different evaluation approaches and methods are often combined in one study. The DECIDE framework provides a useful checklist for planning an evaluation study.

    39. Further reading and resources Chapters 12-15 of the textbook. Chapter 9 of Dix, A., Finlay, J., Abowd, G., and Beale, R. (2004) Human-Computer Interaction (3rd ed.). Prentice-Hall. Chapters 12 and 22 of Benyon, D., Turner, P., and Turner, S. (2005) Designing Interactive Systems. Harlow, England: Addison-Wesley. http://www.id-book.com/catherb/ (interactive tool for heuristics selection). http://www.useit.com/ (Jakob Nielsen's website).
