
Multimedia Specification Design and Production



  1. Multimedia Specification Design and Production 2013 / Semester 1 / week 9 Lecturer: Dr. Nikos Gazepidis gazepidis@ist.edu.gr

  2. More about Evaluation • Learning outcomes • Evaluation (more about this topic) • Evaluation methods • Empirical evaluation (necessary steps) • Analytical evaluation

  3. More about Evaluation Reading List: Notes for Lecture week_8: introduction to evaluation; Faulkner, Chapter 6, pp 137 – 146 (stop before 6.5.1) and 6.6 – 6.16, pp 156 – 173; and Chapter 7, pp 177 – 196

  4. More about Evaluation Evaluation is central to user-centred iterative development and is carried out throughout the development process (although developers often feel they can undertake it by themselves). It is linked to every other activity in the design cycle. Developers are often tempted to skip it because it adds to development time and costs money and effort.

  5. More about Evaluation • Evaluation provides the opportunity to: • help ensure that the system is usable • help ensure that the final system is what users want • catch problems early, which ends up cheaper than fixing problems identified later on • Evaluation is the process of gathering data so that we can answer the key questions: why you are doing it, and what you hope to achieve within the inevitable practical constraints: • access to facilities/equipment • easy access to users • expertise • time • budget • ethical issues

  6. An introduction to Evaluation Terminology Evaluation …the process of systematically gathering data at various stages within the development process, which can be used to improve the designers’ understanding of the users’ requirements and amend the design to meet users’ needs. It can employ a range of techniques, some involving users directly, at different stages, to examine different aspects of the design.

  7. More about Evaluation Evaluation methods: 1. Empirical (based on user experience) 2. Analytical (based on the expert view) 3. Heuristic

  8. More about Evaluation 1. Empirical evaluation: necessary steps • start with a clear understanding of what questions need to be answered, i.e. what the evaluation aims to find out, and set appropriate targets • develop the evaluation activity • select participants to perform tasks • develop tasks for participants to perform (benchmark and representative tasks), giving the participants focused activities (see the sketch below) • determine the protocol and procedures for the evaluation sessions • pilot testing may be necessary to improve the experiment • direct the evaluation sessions
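To make the idea of benchmark and representative tasks concrete, here is a minimal sketch, in Python (the lecture prescribes no particular language), of how benchmark tasks and their usability targets might be written down. The task IDs, descriptions, and target values are hypothetical:

```python
# A minimal sketch of defining benchmark tasks for an empirical evaluation.
# All task names, IDs, and target values below are invented for illustration.
from dataclasses import dataclass

@dataclass
class BenchmarkTask:
    task_id: str
    description: str      # the focused activity given to the participant
    target_time_s: float  # target completion time from the usability spec
    max_errors: int       # acceptable error count for the task

TASKS = [
    BenchmarkTask("T1", "Locate and play a specified video clip", 60.0, 1),
    BenchmarkTask("T2", "Export the current slide as an image", 90.0, 2),
]
```

Recording the tasks in this form makes the targets explicit, so the later analysis step can compare measured performance against them directly.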

  9. More about Evaluation 1. Empirical evaluation: necessary steps • generate data: • quantitative: benchmark tasks, user questionnaires • qualitative: concurrent verbal protocol, retrospective verbal protocol, critical incident reporting, structured interviews • collect data, in order to have evidence on which to base the evaluation: • real-time note-taking • audiotaping • videotaping • internal instrumentation of the interface (sketched below)
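Of the data-collection options above, internal instrumentation of the interface is the most code-oriented, so here is a minimal sketch, assuming a simple append-only event log; the event names and record fields are illustrative assumptions, not a standard API:

```python
# A minimal sketch of internal instrumentation: the application writes a
# time-stamped record of each user interaction to a log file for later analysis.
import json
import time

class InteractionLogger:
    def __init__(self, log_path: str):
        self.log_path = log_path

    def log(self, participant: str, task_id: str, event: str, detail: str = "") -> None:
        record = {
            "timestamp": time.time(),  # when the event occurred
            "participant": participant,
            "task": task_id,
            "event": event,            # e.g. "click", "menu_open", "error"
            "detail": detail,
        }
        with open(self.log_path, "a") as f:
            f.write(json.dumps(record) + "\n")

# Usage: the interface calls the logger at each instrumented point.
logger = InteractionLogger("session.log")
logger.log("P01", "T1", "click", "play_button")
```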

  10. More about Evaluation 1. Empirical evaluation: necessary steps • analyse the data, comparing results with the targets in the usability specifications (see the sketch below) • draw conclusions to form a resolution of each design problem • depending on the outcome of the evaluation, it may be necessary to redesign and implement the revisions
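A minimal sketch of the analysis step, comparing measured completion times against the targets in the usability specification; all numbers are invented for illustration:

```python
# Compare mean task completion times against usability-specification targets.
from statistics import mean

# Measured completion times (seconds) per benchmark task, one value per participant.
measured = {
    "T1": [48.2, 71.5, 55.0, 64.3],
    "T2": [80.1, 95.7, 88.4, 102.0],
}
targets = {"T1": 60.0, "T2": 90.0}  # from the usability specification

for task_id, times in measured.items():
    avg = mean(times)
    verdict = "meets target" if avg <= targets[task_id] else "misses target: redesign candidate"
    print(f"{task_id}: mean {avg:.1f}s vs target {targets[task_id]:.0f}s -> {verdict}")
```

Tasks that miss their targets feed directly into the redesign-and-revise step above.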

  11. More about Evaluation 2. Analytical evaluation The following section of notes has been compiled from papers on Jakob Nielsen’s website http://www.useit.com. Faulkner Chapter 7 provides more detail about the various methods briefly outlined below. Usability inspection is a set of methods that are all based on having evaluators inspect and analyse a user interface. Typically, usability inspection is aimed at finding usability problems in the design, though some methods can also evaluate the overall usability of an entire system. Inspection methods can be applied early in the interaction development lifecycle, allowing feedback, iteration and improvement.

  12. More about Evaluation 2. Analytical evaluation • Heuristic evaluation is the most informal method; it involves having usability specialists judge whether each dialogue element follows established usability principles (the ‘heuristics’). • Heuristic estimation is a variant in which the inspectors are asked to estimate the relative usability of two (or more) designs in quantitative terms (typically expected user performance).

  13. More about Evaluation 2. Analytical evaluation • Cognitive walkthrough uses a more detailed procedure to simulate a user's problem-solving process at each step through the dialogue, checking if the simulated user's goals and memory content can be assumed to lead to the next correct action. • Pluralistic walkthrough uses group meetings where users, developers, and human factors people step through a scenario, discussing each dialogue element.
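The cognitive walkthrough follows a fixed set of questions at each step of the dialogue. The sketch below records the commonly used walkthrough questions against one step; the step name and answers are hypothetical:

```python
# A minimal sketch of recording one cognitive walkthrough step: the analyst
# answers the standard walkthrough questions about the simulated user.
WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the desired effect?",
    "If the correct action is performed, will the user see progress being made?",
]

def walkthrough_step(action: str, answers: list[bool]) -> None:
    print(f"Step: {action}")
    for question, ok in zip(WALKTHROUGH_QUESTIONS, answers):
        print(f"  [{'ok' if ok else 'PROBLEM'}] {question}")

# A "no" answer at any question flags a potential usability problem at that step.
walkthrough_step("Open the export menu", [True, False, True, True])
```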

  14. More about Evaluation 2. Analytical evaluation • Feature inspection lists the sequences of features used to accomplish typical tasks and checks for long sequences, cumbersome steps, steps that would not be natural for users to try, and steps that require extensive knowledge or experience, in order to assess a proposed feature set. • Consistency inspection has designers who represent multiple other projects inspect an interface to see whether it does things in the same way as their own designs.

  15. More about Evaluation 2. Analytical evaluation • Standards inspection has an expert on an interface standard inspect the interface for compliance. • Formal usability inspection combines individual and group inspections in a six-step procedure with strictly defined roles, with elements of both heuristic evaluation and a simplified form of cognitive walkthrough.

  16. More about Evaluation 2. Analytical evaluation Heuristic evaluation, heuristic estimation, cognitive walkthrough, feature inspection, and standards inspection normally have the interface inspected by a single evaluator at a time. In contrast, pluralistic walkthrough and consistency inspection are group inspection methods. Many usability inspection methods are so easy to apply that regular developers can serve as evaluators, though better results are normally achieved when using usability specialists.

  17. More about Evaluation 3. Heuristic evaluation • Heuristics: broad-based rules or principles derived from theoretical knowledge (e.g. cognitive psychology) and practical experience • heuristic evaluation is the most popular usability inspection method • heuristics can be used to inform the design, as well as providing a checklist for evaluation • heuristic evaluation allows quick, cheap and easy evaluation of a user interface design, and is hence known as a ‘discount usability engineering’ method • heuristic evaluation aims to identify usability problems which can then be fixed within the iterative design process • heuristic evaluation involves a small set of evaluators examining an interface and judging its compliance with recognized usability principles (a sketch of how their ratings might be aggregated follows below)
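To illustrate the last point, here is a minimal sketch of aggregating a heuristic evaluation: each evaluator independently rates the severity of every problem found (Nielsen’s 0 – 4 severity scale), and the ratings are averaged to prioritise fixes. The problem descriptions and scores are invented:

```python
# Aggregate severity ratings from a small set of independent evaluators.
# Severity scale (Nielsen): 0 = not a problem ... 4 = usability catastrophe.
from statistics import mean

ratings = {
    "No feedback after an upload completes": [3, 4, 3],
    "Inconsistent wording of menu items": [2, 1, 2],
}

# List problems worst-first, so the iterative design process fixes them in order.
for problem, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    print(f"severity {mean(scores):.1f}  {problem}")
```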
