
System Evaluation: Usability Assessment


Presentation Transcript


  1. System Evaluation: Usability Assessment Dr. Dania Bilal IS 582 Spring 2009

  2. What is Usability? • Evaluation mechanism that measures multiple components of the design of the user interface • Addresses the relationships between a system and its users • Emerges from the human-computer interaction field and is based on user-centered interface design principles

  3. Importance of Usability • Bridges the gap between humans and machines • Provides information about the user experience and goals • Measures system effectiveness in relation to its intended users rather than to system specifications • The sooner problems are found, the less expensive they are to fix • Reduces product cost

  4. Importance of Usability • Learn about • Users and their goals • Difficulties in achieving tasks • System design problems that contribute to user failures • Generate requirements for improving an existing design or creating a new user-centered design

  5. Usability Attributes • As described by Jakob Nielsen • Learnability • Efficiency • Memorability • Errors & their severity • Subjective satisfaction

  6. Learnability • System must be easy to learn, especially for novice users • A hard-to-learn system is, in effect, designed for expert users • Command-driven systems such as Dialog are designed for users who are able to learn the commands and construct search strategies accordingly.

  7. Efficiency • System should be efficient to use so that once the user has learned it, the user can achieve a high level of productivity • Efficiency increases with learning • Efficiency differs from effectiveness (see the sketch below) • Efficiency and effectiveness are both indications of success
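To make the distinction concrete, here is a minimal Python sketch, assuming the common ISO 9241-11 style definitions: effectiveness as the share of task attempts that succeed, and efficiency as successful tasks per unit of time. The TaskResult record and the sample numbers are illustrative assumptions, not from the slides.

```python
from dataclasses import dataclass

@dataclass
class TaskResult:
    """One user's attempt at one task (illustrative record)."""
    completed: bool   # did the user achieve the task goal?
    seconds: float    # time spent on the attempt

def effectiveness(results: list[TaskResult]) -> float:
    """Effectiveness: share of attempts that succeeded."""
    return sum(r.completed for r in results) / len(results)

def efficiency(results: list[TaskResult]) -> float:
    """Efficiency: successful tasks per minute of time spent."""
    minutes = sum(r.seconds for r in results) / 60
    return sum(r.completed for r in results) / minutes

log = [TaskResult(True, 90), TaskResult(False, 240), TaskResult(True, 60)]
print(f"effectiveness: {effectiveness(log):.2f}")         # 0.67
print(f"efficiency: {efficiency(log):.2f} tasks/minute")  # 0.31
```

A system can be effective but inefficient (users eventually succeed, slowly), which is why the two measures are reported separately.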

  8. Memorability • System should be easy to remember • No need to learn how to use the system all over again after a period of not using it • System features (searching, browsing, finding hidden features, etc.) should be easy to remember in terms of • How: how to find them • What: what they are and what they do • Where: where they are in the system

  9. Errors • System should have a low error rate • System should provide the user with a recovery mechanism • How to correct a problem, what to do next, suggestions for correcting it, etc. • Two types of errors: • Minor errors • Major errors

  10. Minor Errors • Errors that do not greatly slow down the user’s interaction with the system • User is able to recover from them • Through system feedback • Through awareness of the error made • Not considered catastrophic

  11. Major Errors • Difficult to recover from • Lead to faulty work if high in frequency • May not be discovered by the user • Considered catastrophic • Affect productivity • Cause negative affect (e.g., frustration) • Cause users to abandon the system

  12. Subjective Satisfaction • System should be likeable to users • System should meet user goals • Satisfaction • Positive experience • Sense of achievement • Willingness to use the system again

  13. Assumptions • The designer’s best guess is not good enough • The user is always right • The user is not always right • Users are not designers • Designers are not users • More features are not always better • Minor interface details matter • Online help does not really help Nielsen, J. (1993). Usability Engineering. San Francisco: Morgan Kaufmann.

  14. Jakob Nielsen’s 10 Usability Heuristics • Visibility of system status • Match between system and the real world • User control and freedom • Consistency & standards • Error prevention • Help users recognize, diagnose, and recover from errors • Recognition rather than recall • Flexibility & efficiency of use • Aesthetic and minimalist design • Help and documentation

  15. Eight Golden Rules by Shneiderman & Plaisant (2009) • Strive for consistency • Cater to universal usability • Offer informative feedback • Design dialogs to yield closure • Prevent errors • Permit easy reversal of actions • Support internal locus of control • Reduce short-term memory load Designing the User Interface, chapter 2, pp. 74-75.

  16. Factors Influencing User Interaction • Address the functionality of the system vis-à-vis: • User goals and tasks • User expectations • User cognitive processes, mental model, and conceptualization of system use • User methods for solving tasks • Context • System interface design

  17. Usability Methods

  18. What Method(s) to Use? • One method or a mix of methods can be used • The choice depends on • Project usability goals • Budget • Time constraints • Availability of experts for data collection and analysis • Recruitment of users to participate in data collection • Complexity of the system to be assessed

  19. Cognitive Walkthrough Method • Focuses on how a user might achieve a task and the problems he/she might experience • Experts play the role of the user and put themselves in the user’s shoes in performing each task • Experts perform tasks given by the experimenter (project leader or observer) • Experts take notes while performing each task, keeping the intended user in mind

  20. Cognitive Walkthrough Method • Role of experimenter/observer • Meets with experts prior to data collection to decide on the parts of the system interface to evaluate, broad tasks, and other matters as needed • Prepares specific tasks • Develops usability materials • Provides an introduction to the data collection • Decides the time and location of data collection, individually or in consultation with the experts • Schedules a debriefing session with the experts to discuss the collected data • Develops benchmarks by which to evaluate each expert’s success on each task • Reviews the draft of data collection compiled by each expert • Analyzes the collected data • Meets with the experts to discuss the draft usability report • Compiles the final usability report (a sketch of these artifacts follows below) • See also Project #2 for additional information.
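As a concrete illustration of the artifacts this slide describes, the sketch below models the experimenter's tasks and benchmarks and the experts' notes as simple records. All class and field names are hypothetical; the slides do not prescribe any data format.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A specific task the experimenter prepares for the experts."""
    description: str  # e.g. "Find a known book by title in the catalog"
    benchmark: str    # the experimenter's success criterion for the task

@dataclass
class WalkthroughNote:
    """One observation an expert records while performing a task."""
    task: Task
    step: str         # action the intended user would take at this point
    problem: str      # difficulty the intended user would likely face

@dataclass
class ExpertLog:
    """Draft data collection compiled by one expert for debriefing."""
    expert: str
    notes: list[WalkthroughNote] = field(default_factory=list)
```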

  21. Heuristic Evaluation/Inspection • Evaluators individually interact with an interface and evaluate it based on predefined sets of heuristics/rules/guidelines • Each evaluator goes through the interface and rates each usability problem or violation found based on a severity rating scale

  22. Heuristic Evaluation/Inspection • Each evaluator compiles a usability report containing • Each feature evaluated in the interface • A severity rating for each usability problem found in the interface • An explanation of each usability problem, saying why it is a problem • A suggestion for solving the problem (one way to record this is sketched below)
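The four elements of the per-evaluator report map naturally onto a small record type. The sketch below is one possible encoding; the class name, fields, and sample values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    """One problem found by one evaluator during an inspection."""
    feature: str      # interface feature evaluated
    heuristic: str    # heuristic/rule/guideline that is violated
    severity: int     # 0-4 on the severity scale (slide 25)
    explanation: str  # why this is a usability problem
    suggestion: str   # proposed solution

example = UsabilityProblem(
    feature="search box",
    heuristic="Visibility of system status",
    severity=3,
    explanation="No feedback is shown while results are loading",
    suggestion="Display a progress indicator during the search",
)
```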

  23. Heuristic Evaluation/Inspection • One of the evaluators (or the team leader) compiles the evaluators’ reports • The team leader aggregates the results of the reports, calculating the mean value of the severity ratings given by the evaluators (see the sketch below) • Generates the final usability report
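A minimal sketch of the aggregation step, assuming each evaluator's report is represented as a mapping from a problem description to that evaluator's 0-4 severity rating; the input format is an assumption. It computes the mean rating per problem, as the slide suggests, and orders the final report with the most severe problems first.

```python
from collections import defaultdict
from statistics import mean

def aggregate(reports: list[dict[str, int]]) -> list[tuple[str, float]]:
    """Merge per-evaluator reports into (problem, mean severity) pairs,
    sorted with the most severe problems first."""
    ratings: dict[str, list[int]] = defaultdict(list)
    for report in reports:
        for problem, severity in report.items():
            ratings[problem].append(severity)
    return sorted(((p, mean(r)) for p, r in ratings.items()),
                  key=lambda pair: pair[1], reverse=True)

reports = [
    {"no feedback during search": 3, "low-contrast labels": 1},
    {"no feedback during search": 4, "low-contrast labels": 2},
]
for problem, severity in aggregate(reports):
    print(f"{severity:.1f}  {problem}")   # 3.5 and 1.5
```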

  24. Severity of System Problems • Combination of 3 factors • Frequency of problem occurrence • Is it common or rare? • Impact • Is the problem easy or difficult for the user to overcome? • Persistence • Is it a one-time problem that users can overcome once they know about it, or will it bother them repeatedly?

  25. Severity Rating Scale of Usability Problems • 0 = I don't agree that this is a usability problem at all • 1 = Cosmetic problem only: need not be fixed unless extra time is available on the project • 2 = Minor usability problem: fixing this should be given low priority • 3 = Major usability problem: important to fix, so should be given high priority • 4 = Usability catastrophe: imperative to fix this before the product can be released Nielsen, Jakob. http://www.useit.com/papers/heuristic/severityrating.html
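Nielsen's 0-4 scale translates directly into an enumeration, which keeps each numeric rating paired with its meaning when compiling reports. This is a convenience sketch, not part of Nielsen's materials.

```python
from enum import IntEnum

class Severity(IntEnum):
    """Nielsen's 0-4 severity rating scale for usability problems."""
    NOT_A_PROBLEM = 0  # not a usability problem at all
    COSMETIC = 1       # fix only if extra project time is available
    MINOR = 2          # fixing should be given low priority
    MAJOR = 3          # important to fix, high priority
    CATASTROPHE = 4    # imperative to fix before release

rating = Severity(3)
print(rating.name, int(rating))  # MAJOR 3
```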

  26. Class Activity • Visit the Library of Congress OPAC. • Select a component of the interface to evaluate against a combined list of Nielsen’s heuristics and Shneiderman & Plaisant’s golden rules.

  27. Sources • http://www.usabilityfirst.com/methods • http://www.useit.com/papers/heuristic/heuristic_list.html (Nielsen’s usability heuristics) • http://www.useit.com/papers/heuristic/heuristic_evaluation.html (how to conduct a heuristic evaluation) • http://www.uie.com/articles (collection of articles) • http://www.uie.com/articles/usability_tests_learn/ (learning about usability tests, by Jared Spool) • http://www.useit.com/papers/heuristic/severityrating.html (Nielsen’s severity rating scale)
