
Designing the User Interface and Evaluating Usability Chapters 11-12


Presentation Transcript


  1. Designing the User Interface and Evaluating Usability Chapters 11-12

  2. Agenda • Learning Objectives • Forms and Reports • GUI design goals • Individual differences • Some design guidelines • Intro to usability evaluation

  3. Learning Objectives • Explain the process of designing forms and reports and the deliverables produced in creating them • Be able to apply guidelines for effective form and report design, including the use of color and the formatting of text, tables, and lists • Explain how to assess usability and describe how variations in users, tasks, technology, and environmental characteristics influence the usability of forms and reports

  4. Designing Forms and Reports (Chapter 11)

  5. Forms and Reports • Forms • Contain predefined data and areas for data to be filled in • Data elements correspond to dataflow contents on DFD • Reports or Query Results • Contain only predefined or derived data • Data elements correspond to dataflow contents on DFD

  6. Designing Forms and Reports (Cont.) Form: a business document that contains some predefined data and may include some areas where additional data are to be filled in. An instance of a form is typically based on one database record.

  7. Designing Forms and Reports (Cont.) Report: a business document that contains only predefined data; it is a passive document used solely for reading or viewing data. A report typically contains data from many unrelated records or transactions.
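The two definitions above differ mainly in editability and scope. A minimal TypeScript sketch of that contrast follows; the record shape and field names are hypothetical, chosen only to show that a form instance maps to one editable record while a report is a read-only view over many records.

```typescript
// Hypothetical record type used by both a form and a report.
interface CustomerRecord {
  id: number;
  name: string;
  creditLimit: number;
}

// A form instance: one record, with areas where data are still to be filled in.
interface CustomerForm {
  record: CustomerRecord;                       // typically backed by a single database row
  editableFields: Array<keyof CustomerRecord>;  // fields the user may complete or change
}

// A report: predefined, read-only data drawn from many records or transactions.
interface CustomerReport {
  readonly title: string;
  readonly rows: ReadonlyArray<CustomerRecord>;
  readonly generatedAt: Date;
}
```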

  8. Designing Forms and Reports (Cont.) Common types of reports: Scheduled: produced at predefined time intervals for routine information needs. Key-indicator: provides a summary of critical information on a regular basis.

  9. Designing Forms and Reports (Cont.) Exception: highlights data outside of normal operating ranges. Drill-down: provides the details behind the summaries of key-indicator or exception reports. Ad-hoc: responds to unplanned requests for non-routine information needs.
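To make one of these report types concrete, here is a small, hypothetical sketch of an exception report: it keeps only the rows whose value falls outside a stated normal operating range. The data shape and the thresholds are illustrative, not taken from the chapter.

```typescript
interface InventoryRow {
  item: string;
  quantityOnHand: number;
}

// Normal operating range for stock levels (illustrative thresholds).
const MIN_OK = 20;
const MAX_OK = 500;

// Exception report: include only the data outside the normal range.
function exceptionReport(rows: InventoryRow[]): InventoryRow[] {
  return rows.filter(r => r.quantityOnHand < MIN_OK || r.quantityOnHand > MAX_OK);
}

const report = exceptionReport([
  { item: "Widget", quantityOnHand: 3 },    // below range -> included
  { item: "Gadget", quantityOnHand: 120 },  // in range    -> excluded
  { item: "Gizmo", quantityOnHand: 800 },   // above range -> included
]);
console.log(report);
```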

  10. The Process of Designing Forms and Reports A user-focused activity that follows a prototyping approach. The first step is to gain an understanding of the intended users and task objectives by collecting initial requirements during requirements determination.

  11. The Process of Designing Forms and Reports Requirements determination: Who will use the form or report? What is the purpose of the form or report? When is the report needed or used? Where does the form or report need to be delivered and used? How many people need to use or view the form or report?

  12. The Process of Designing Forms and Reports (Cont.) Prototyping: an initial prototype is designed from the requirements. Users review the prototype design and either accept the design or request changes. If changes are requested, the construction-evaluation-refinement cycle is repeated until the design is accepted.
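The construction-evaluation-refinement cycle can be read as a simple loop. The sketch below only illustrates that control flow; the types and the stand-in functions for building a prototype and collecting user feedback are hypothetical, since the real work is design and user review.

```typescript
// Hypothetical stand-ins for real design work and user review sessions.
interface Prototype { version: number; notes: string[]; }
interface Review { accepted: boolean; requestedChanges: string[]; }

function buildPrototype(previous: Prototype | null, changes: string[]): Prototype {
  return { version: (previous?.version ?? 0) + 1, notes: changes };
}

// Placeholder: in practice this is a review session with the intended users.
function userReview(prototype: Prototype): Review {
  const accepted = prototype.version >= 3;  // pretend users accept the third iteration
  return { accepted, requestedChanges: accepted ? [] : ["adjust layout"] };
}

// Construction-evaluation-refinement: repeat until the design is accepted.
function designFormOrReport(): Prototype {
  let prototype = buildPrototype(null, []);
  let review = userReview(prototype);
  while (!review.accepted) {
    prototype = buildPrototype(prototype, review.requestedChanges);
    review = userReview(prototype);
  }
  return prototype;
}

console.log(designFormOrReport());  // { version: 3, notes: ['adjust layout'] }
```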

  13. The Process of Designing Forms and Reports (Cont.) A coding sheet is an “old” tool for designing forms and reports, usually associated with text-based forms and reports for mainframe applications. Visual Basic and other development tools provide computer-aided GUI form and report generation.

  14. Deliverables • Design (Content and Layout) Specifications • Narrative overview • Target users, tasks, system, environmental factors in which form or report will be used (i.e., who, what, where, when, and why or how) • Sample design • Testing and usability assessment

  15. Example of Poor Design

  16. Example of Better Design

  17. Uses of Highlighting in Forms and Reports • Notify users of errors in data entry or processing. • Provide warnings regarding possible problems. • Draw attention to keywords, commands, high-priority messages, unusual data values.
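As one concrete case of the first two bullets, the hypothetical sketch below validates a single entry and returns a highlight decision that a form renderer could apply (for example, a red border plus a message). The field, the rule, and the thresholds are invented for illustration.

```typescript
type Highlight =
  | { kind: "none" }
  | { kind: "error"; message: string }      // errors in data entry
  | { kind: "warning"; message: string };   // possible problems

// Hypothetical rule: quantities must be positive; unusually large values get a warning.
function highlightQuantity(value: number): Highlight {
  if (!Number.isFinite(value) || value <= 0) {
    return { kind: "error", message: "Quantity must be a positive number." };
  }
  if (value > 10_000) {
    return { kind: "warning", message: "Unusually large quantity; please confirm." };
  }
  return { kind: "none" };
}

console.log(highlightQuantity(-5));      // error   -> draw attention to the field
console.log(highlightQuantity(25_000));  // warning -> unusual data value
console.log(highlightQuantity(12));      // none
```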

  18. Color vs. No Color • Benefits from using color: soothes or strikes the eye; accents an uninteresting display; facilitates subtle discriminations in complex displays; emphasizes the logical organization of information; draws attention to warnings; evokes more emotional reactions • Problems from using color: color pairings may wash out or cause problems for some users; resolution may degrade with different displays; color fidelity may degrade on different displays; printing or conversion to other media may not translate easily

  19. Usability Evaluation “Sirs, I have tested your machine. It adds a new terror to life and makes death a long felt want.” - Sir Herbert Beerbohm Tree on examining a gramophone player

  20. For Starters... • What do you think makes something “usable”? • For the web: what makes a website “sticky”? • What websites do you find to be the most usable? Why?

  21. General GUI Design Goals • Aesthetically Pleasing • Contrast between screen elements • Groupings of like items • Align elements and groups • Use color effectively and simply • Clarity • Compatibility • With user • With task • With related systems

  22. General GUI Design Goals • Comprehensibility • Easy to learn and understand • What to look at, what to do, when to do it, why to do it, how to do it • Configurability • Sense of control • Allow for personal preferences • Consistency • Similarity in look • Same action yields same result • Same location

  23. General GUI Design Goals (cont’d) • Control • Explicit user requests • Quick reaction • Actions capable of interruption or termination • Customizable, proper defaults • Directness • Efficiency • Familiarity • Familiar concepts and language • Mimic natural behavior • Use real-world metaphors

  24. General GUI Design Goals (cont’d) • Flexibility • Sensitive to individual differences (knowledge, skills, experience, conditions) • Forgiveness • Tolerant of common and unavoidable errors • Prevent errors whenever possible • Protect against catastrophic errors • Provide constructive error messages • Predictability • Recovery • Able to reverse actions
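Forgiveness and recovery are easiest to see in code as reversible actions. This is a minimal, generic undo-stack sketch (not from the slides) showing one way destructive steps can be made recoverable; the command shape and the numeric example are assumptions for illustration.

```typescript
// Generic command with an explicit inverse, so every action can be reversed.
interface Command<S> {
  apply(state: S): S;
  undo(state: S): S;
}

class UndoStack<S> {
  private history: Command<S>[] = [];

  constructor(private state: S) {}

  run(command: Command<S>): S {
    this.state = command.apply(this.state);
    this.history.push(command);  // remember how to reverse the action
    return this.state;
  }

  undo(): S {
    const last = this.history.pop();
    if (last) this.state = last.undo(this.state);  // recovery: reverse the last action
    return this.state;
  }
}

// Usage with a simple numeric state and a hypothetical "add" command.
const add = (n: number): Command<number> => ({
  apply: s => s + n,
  undo: s => s - n,
});

const stack = new UndoStack(0);
stack.run(add(5));          // state becomes 5
console.log(stack.undo());  // back to 0
```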

  25. General GUI Design Goals (cont’d) • Responsiveness • Rapid response • Acknowledge user action (visual, text, audio) • Simplicity • Hide until needed • Common functions first • Defaults • Make common actions simple and uncommon actions more work • Transparency • User can focus on task without unneeded information on software actions

  26. Individual Differences • Knowledge/Experience -- e.g., Computer literacy, Application experience, Task experience, Reading level, Native language • Job/Task -- e.g., Frequency of use, Turnover rate, Task importance • Psychological Characteristics -- e.g., Attitude, Motivation, Cognitive style • Physical Characteristics -- e.g., Age, Gender, Handedness, Handicaps

  27. Individual Differences • Explain what usability guidelines for configurability, simplicity, and responsiveness would mean for the following: • A novice vs. experienced airline reservations clerk • A manager reviewing sales • A customer at an ATM

  28. Evaluating Usability • Role of guidelines • Usability engineering • Some basics of web usability

  29. Metrics for Assessing Usability • Time to learn • Speed of performance • Rate of errors • Retention over time • Subjective satisfaction
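The sketch below shows one way these metrics could be summarized from logged test sessions. The session fields and the simple averaging are illustrative assumptions, not a standard from the chapters (retention over time would need a repeat session and is omitted here).

```typescript
// One logged usability-test session (hypothetical shape).
interface Session {
  secondsToFirstSuccess: number;  // time to learn
  secondsPerTask: number;         // speed of performance
  errors: number;
  tasks: number;
  satisfaction: number;           // e.g., 1-7 questionnaire score
}

const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

function summarize(sessions: Session[]) {
  return {
    meanTimeToLearnSec: mean(sessions.map(s => s.secondsToFirstSuccess)),
    meanTaskTimeSec: mean(sessions.map(s => s.secondsPerTask)),
    errorRate: mean(sessions.map(s => s.errors / s.tasks)),  // errors per task
    meanSatisfaction: mean(sessions.map(s => s.satisfaction)),
  };
}

console.log(summarize([
  { secondsToFirstSuccess: 300, secondsPerTask: 45, errors: 2, tasks: 10, satisfaction: 5 },
  { secondsToFirstSuccess: 420, secondsPerTask: 60, errors: 4, tasks: 10, satisfaction: 4 },
]));
```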

  30. Why Do Usability Evaluation? • Goal of “satisfying” the user is not good enough • Instead, emphasis should be on “delighting” the user • System’s usability is THE key issue for users • Can differentiate product from competitors • Builds loyalty and goodwill from users

  31. So, what is “usability”? • How would you define it? • Working definition (Shackel, 1991): • a system’s capability in human terms to be used easily and effectively by the specified range of users, given specified training and support, to fulfill a given set of tasks...

  32. Perhaps this is easier... • What usability isn’t: • A “stamp of approval” or “certification” • Easy for anyone to operate • Lots of features • Adherence to a set of “guidelines” • All of these may be good, but do not make a system inherently usable.

  33. So, what’s wrong with “guidelines”? • Nothing, if used the right way • But, not a shortcut to designing usable systems • Adhering to guidelines does not, by itself, make a system “usable”

  34. Web Design Guidelines, Examples • IBM Web Usability Guidelines • “Use frames in top and left areas of the page…” • Jakob Nielsen’s Top Ten Mistakes in Web Design • “Mistake #1. Use of frames”

  35. Types of Usability Evaluation • Expert-based • User-based

  36. Expert Reviews • Rely on in-house experts or consultants • Many different varieties... • Heuristic • Consistency inspection • Cognitive walkthrough • Who is the “expert”? • System developer/designer?

  37. Nielsen’s (1994) heuristics • Visible system status • Match between system and the real world • User control and freedom • Consistency and standards • Error prevention • Recognition over recall • Flexibility and efficiency • Aesthetic and minimalist design • Help users recognize, diagnose, and recover from errors • Help and documentation

  38. Nielsen (1993), Usability Engineering [chart: proportion of usability problems found (0-100%) versus number of evaluators (0-15), with separate curves for task-and-usability (“double”) experts, regular evaluators, and novice evaluators]
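Curves like these are often summarized with the Nielsen-Landauer model, in which the proportion of problems found by i independent evaluators is 1 - (1 - λ)^i, where λ is the probability that a single evaluator finds a given problem (roughly 0.3 in commonly cited figures for average evaluators, higher for double experts, lower for novices). The sketch below simply evaluates that formula; the specific λ values per evaluator group are illustrative assumptions, not the slide's data.

```typescript
// Proportion of usability problems found by i independent evaluators,
// assuming each finds a given problem with probability lambda.
function proportionFound(i: number, lambda: number): number {
  return 1 - Math.pow(1 - lambda, i);
}

// Illustrative per-group detection rates (novice < regular < double expert).
const groups = { novice: 0.2, regular: 0.3, doubleExpert: 0.45 };

for (const [group, lambda] of Object.entries(groups)) {
  const row = [1, 3, 5, 10, 15]
    .map(i => `${i}:${(100 * proportionFound(i, lambda)).toFixed(0)}%`)
    .join("  ");
  console.log(`${group.padEnd(12)} ${row}`);
}
```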

  39. User-Based: Usability Testing/Laboratories • Indicates shift to focus on user needs • Usability testing can speed up process and save costs • Emergence of “usability labs”

  40. Usability Lab • Many varieties • User work area and separate observation area with mirrors, cameras, other data collection tools • Typically staffed by experts in testing, human factors, and/or user interface design • Staff work with designers to come up with a usability evaluation plan

  41. Usability Test Plan • Methodology • Should include • Users (who, where, how many, etc.) • Tasks (what and why) • Measures (what and how) • Analysis procedures

  42. User Testing • Up-front objective, measurable goals that must be met before system is accepted • Might include… • Time to learn • Speed • Error rates • Retention over time (i.e., recall of commands) • Subjective satisfaction
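One way to make “measurable goals that must be met before the system is accepted” operational is a simple threshold check against the summary metrics. The goal values and the result shape below are invented for illustration.

```typescript
interface Results {
  meanTimeToLearnSec: number;
  meanTaskTimeSec: number;
  errorRate: number;         // errors per task
  meanSatisfaction: number;  // e.g., 1-7 scale
}

// Hypothetical acceptance goals agreed on up front.
const goals = {
  maxTimeToLearnSec: 600,
  maxTaskTimeSec: 90,
  maxErrorRate: 0.2,
  minSatisfaction: 5,
};

function accepted(r: Results): boolean {
  return (
    r.meanTimeToLearnSec <= goals.maxTimeToLearnSec &&
    r.meanTaskTimeSec <= goals.maxTaskTimeSec &&
    r.errorRate <= goals.maxErrorRate &&
    r.meanSatisfaction >= goals.minSatisfaction
  );
}

console.log(accepted({
  meanTimeToLearnSec: 360, meanTaskTimeSec: 52, errorRate: 0.3, meanSatisfaction: 4.5,
}));  // false: error-rate and satisfaction goals are not met
```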

  43. Summary: Usability Evaluation • Lots of different types (expert, user) • Each approach brings its own strengths and problems • What is “usable” can differ based on who evaluates or what is evaluated • Complete picture: look at the “3 Ps” • Power: functionality, what capabilities does the tool have? • Perception: user’s subjective ratings (satisfaction, ease of use), aesthetics, preferences • Performance: empirical data from actual use of the tool

  44. Summary • Forms and Reports are examples of user interfaces • Guidelines are useful in designing user interfaces • Take differences in users, tasks, technology, and environment into account when designing forms, reports, other user interfaces • Designs should be evaluated for usability
