
The next two weeks


Presentation Transcript


  1. The next two weeks
     • Oct 21 & 23:
       • Lectures on user interface evaluation
     • Oct 28:
       • Lecture by Dr. Maurice Masliah
       • No office hours (out of town)
     • Oct 30:
       • Midterm in class
       • No office hours (out of town)

  2. Midterm material
     • Everything up to exactly this point (including DemoCustomDialog)
     • Things to study:
       • Slides
       • Programs
       • Javadoc
     • No need to memorize all the methods of the Swing classes; familiarity with the most common ones will be tested, though

  3. Evaluating User Interfaces
     Material taken mostly from “Interaction Design” (Preece, Rogers & Sharp, 2002)

  4. User Interface Humor

  5. User Interface Evaluation
     • Users want systems that are easy to learn and use
     • Systems also have to be effective, efficient, safe, and satisfying
     • Important to know:
       • What to evaluate
       • Why it is important
       • When to evaluate

  6. What to evaluate
     • All evaluation studies must have specific goals and must attempt to address specific questions
     • Vast array of features:
       • Some are best evaluated in a lab, e.g. the sequence of links to find a website
       • Others are better evaluated in natural settings, e.g. whether children enjoy a particular game

  7. Why it is important to evaluate
     • Problems are fixed before the product is shipped, not after
     • One can concentrate on real problems, not imaginary ones
     • Developers code instead of debating
     • Time to market is sharply reduced
     • The finished product is immediately usable

  8. When to evaluate
     • Ideally, as early as possible (from the prototyping stage) and then repeatedly throughout the development process: “test early and often.”

  9. Evaluation Paradigms
     • “Quick and Dirty” evaluation
     • Usability Testing
     • Field studies
     • Predictive evaluation

  10. “Quick and Dirty” evaluation
     • User-centered, highly practical approach
     • Used when quick feedback about a design is needed
     • Can be conducted in a lab or in the user’s natural environment
     • Users are expected to behave naturally
     • Evaluators take minimum control
     • Sketches, quotes, and descriptive reports are fed back into the design process

  11. Usability Testing
     • Applied approach based on experimentation
     • Used when a prototype or a product is available
     • Takes place in a lab
     • Users carry out set tasks
     • Evaluators are strongly in control
     • Users’ opinions are collected by questionnaire or interview
     • Reports of performance measures, errors, etc. are fed back into the design process (a minimal measurement sketch follows after this slide)
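  A minimal sketch, assuming the Java setting used elsewhere in the course, of how the performance measures above (task time, error counts) might be collected during a test session. The class and method names are hypothetical, not part of the course code:

    import java.util.ArrayList;
    import java.util.List;

    // Records one user's performance on one set task.
    public class TaskMetrics {
        private final String taskName;
        private long startNanos;
        private int errorCount;
        private final List<String> observations = new ArrayList<>();

        public TaskMetrics(String taskName) { this.taskName = taskName; }

        public void start() { startNanos = System.nanoTime(); }

        // Called whenever the observer sees the user make a mistake.
        public void recordError(String note) {
            errorCount++;
            observations.add(note);
        }

        // One-line report to feed back into the design process.
        public String finish() {
            double seconds = (System.nanoTime() - startNanos) / 1e9;
            return String.format("%s: %.1f s, %d errors %s",
                    taskName, seconds, errorCount, observations);
        }

        public static void main(String[] args) throws InterruptedException {
            TaskMetrics m = new TaskMetrics("Book a flight");
            m.start();
            Thread.sleep(100);  // stands in for the user's work on the task
            m.recordError("clicked Back instead of Search");
            System.out.println(m.finish());
        }
    }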

  12. Field studies
     • Often used early in design to check that users’ needs are met, or to assess problems or design opportunities
     • Conducted in the user’s natural environment
     • Evaluators try to develop relationships with users
     • Qualitative descriptions that include quotes, sketches, and anecdotes are produced

  13. Predictive evaluation
     • Does not involve users
     • Expert evaluators use practical heuristics and practitioner expertise to predict usability problems
     • Usually conducted in a lab
     • Reviewers provide a list of problems, often with suggested solutions

  14. Evaluation techniques
     • Observing users
     • Asking users their opinions
     • Asking experts their opinions
     • Testing users’ performance
     • Modeling users’ task performance to predict the efficacy of a user interface (see the sketch after this slide)
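  The slide does not name a specific model; one standard example of modeling task performance is the Keystroke-Level Model (KLM) of Card, Moran and Newell, which predicts expert task time by summing average times for primitive operators. The sketch below uses the commonly cited operator estimates; a real analysis would calibrate them for the target users:

    // Sketch of a Keystroke-Level Model (KLM) estimate.
    public class KlmEstimate {
        static final double K = 0.28;  // press a key (average typist)
        static final double P = 1.10;  // point at a target with the mouse
        static final double B = 0.10;  // press or release a mouse button
        static final double H = 0.40;  // move hands between keyboard and mouse
        static final double M = 1.35;  // mental preparation

        public static void main(String[] args) {
            // Task: click a text field, type a 5-character code, click OK.
            double t = M + H + P + 2 * B   // think, grab mouse, click the field
                     + H + 5 * K           // hands back to keyboard, type code
                     + M + H + P + 2 * B;  // verify, grab mouse, click OK
            System.out.printf("Predicted expert task time: %.2f s%n", t);
        }
    }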

  15. The DECIDE framework
     • Determine the overall goals that the evaluation addresses
     • Explore the specific questions to be answered
     • Choose the evaluation paradigm and techniques
     • Identify practical issues
     • Decide how to deal with the ethical issues
     • Evaluate, interpret, and present the data

  16. Determine the overall goals
     • What are the high-level goals of the evaluation?
     • Examples:
       • Check that the evaluators have understood the users’ needs
       • Ensure that the final interface is consistent
       • Determine how to improve the usability of a user interface

  17. Explore specific questions
     • Break down the overall goals into relevant questions
     • Overall goal: Why do customers prefer paper tickets to e-tickets?
     • Specific questions:
       • What is the customers’ attitude?
       • Do they have adequate access to computers?
       • Are they concerned about security?
       • Does the electronic system have a bad reputation?
       • Is its user interface poor?

  18. Choose paradigm and techniques
     • Practical and ethical issues might be considered
     • Factors:
       • Cost
       • Timeframe
       • Available equipment or expertise
     • Compromises may have to be made

  19. Identify practical issues
     • Important to do this before starting
     • Find appropriate users
     • Decide on the facilities and equipment to be used
     • Account for schedule and budget constraints
     • Prepare testing conditions
     • Plan how to run the tests

  20. Decide on ethical issues
     • Studies involving human subjects must uphold an ethical code of conduct
     • The privacy of subjects must be protected
     • Personal records must be kept confidential
     • An exact description of the experiment must be submitted for approval

  21. Evaluate the data
     • Should quantitative data be treated statistically? (a minimal sketch follows after this slide)
     • How should qualitative data be analyzed?
     • Issues to consider:
       • Reliability (consistency)
       • Validity
       • Biases
       • Scope
       • Ecological validity
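  A minimal sketch, in Java, of treating quantitative data statistically: computing the mean and sample standard deviation of task-completion times. The data values are made up for illustration:

    import java.util.Arrays;

    // Basic descriptive statistics for task-completion times.
    public class TaskTimeStats {
        public static void main(String[] args) {
            double[] seconds = {41.2, 37.8, 52.5, 44.0, 39.9};  // made-up data

            double mean = Arrays.stream(seconds).average().orElse(0);

            // Sample standard deviation (divide by n - 1).
            double sumSq = Arrays.stream(seconds)
                                 .map(t -> (t - mean) * (t - mean))
                                 .sum();
            double sd = Math.sqrt(sumSq / (seconds.length - 1));

            System.out.printf("mean = %.1f s, sd = %.1f s (n = %d)%n",
                              mean, sd, seconds.length);
        }
    }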

  22. We’ll take a closer look at…
     • Two predictive evaluation techniques:
       • Heuristic evaluation
       • Cognitive walkthroughs
     • A usability testing technique:
       • User testing

  23. Heuristic Evaluation
     • Heuristic evaluation is a technique in which experts, guided by a set of usability principles known as heuristics, evaluate whether user interface elements conform to the principles
     • Developed by Jakob Nielsen
     • The heuristics bear a close resemblance to design principles and guidelines
     • Interesting article on heuristic evaluation: http://www.useit.com/papers/heuristic/heuristic_evaluation.html
     (A sketch of how a finding might be recorded follows after this slide.)
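  A minimal sketch of how one finding from such an evaluation might be recorded. The 0–4 severity scale follows Nielsen's common convention; the class itself is hypothetical, not part of the course code:

    // One usability problem found during a heuristic evaluation.
    public class HeuristicFinding {
        final String heuristic;    // which heuristic is violated
        final String location;     // where in the interface
        final String description;  // what the problem is
        final int severity;        // 0 = not a problem ... 4 = catastrophe

        HeuristicFinding(String heuristic, String location,
                         String description, int severity) {
            this.heuristic = heuristic;
            this.location = location;
            this.description = description;
            this.severity = severity;
        }

        public static void main(String[] args) {
            HeuristicFinding f = new HeuristicFinding(
                "Visibility of system status",
                "File upload dialog",
                "No progress indication while a large file uploads",
                3);
            System.out.println(f.heuristic + " [severity " + f.severity + "]: "
                               + f.description + " (" + f.location + ")");
        }
    }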

  24. List of heuristics
     • Visibility of system status
     • Match between system and the real world
     • User control and freedom
     • Consistency and standards
     • Help users recognize, diagnose, and recover from errors

  25. List of heuristics (cont.)
     • Error prevention
     • Recognition rather than recall
     • Flexibility and efficiency of use
     • Aesthetic and minimalist design
     • Help and documentation
     (A small Swing sketch illustrating two of these heuristics follows below.)
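  Since the course uses Swing, here is a minimal sketch of two of the heuristics above in code: error prevention (the OK button stays disabled until the input is valid) and visibility of system status (a status label reports what the system expects). The dialog is hypothetical, not one of the course demos:

    import javax.swing.*;
    import javax.swing.event.DocumentEvent;
    import javax.swing.event.DocumentListener;
    import java.awt.BorderLayout;

    public class HeuristicsDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Heuristics demo");
                JTextField name = new JTextField(20);
                JButton ok = new JButton("OK");
                JLabel status = new JLabel("Enter a name to continue");
                ok.setEnabled(false);  // error prevention: no empty submissions

                name.getDocument().addDocumentListener(new DocumentListener() {
                    private void update() {
                        boolean valid = !name.getText().trim().isEmpty();
                        ok.setEnabled(valid);  // error prevention
                        // visibility of system status
                        status.setText(valid ? "Ready"
                                             : "Enter a name to continue");
                    }
                    public void insertUpdate(DocumentEvent e)  { update(); }
                    public void removeUpdate(DocumentEvent e)  { update(); }
                    public void changedUpdate(DocumentEvent e) { update(); }
                });

                frame.setLayout(new BorderLayout(5, 5));
                frame.add(name, BorderLayout.NORTH);
                frame.add(ok, BorderLayout.CENTER);
                frame.add(status, BorderLayout.SOUTH);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }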
