
Topics for today: April 7, 2004


Presentation Transcript


  1. Final exam: Wed April 21, 8:00 a.m., 424 HA. Topics for today: April 7, 2004 • Finish discussion of error recovery/documentation from last week • Ratner Ch. 9 – study of improved error messages • Ratner Ch. 15 – study of “live help” systems • Finish discussion of usability life cycle from Monday • Ratner Ch. 4 – methodology for cost/benefit analysis of usability engineering activities

  2. Documentation Guidelines - Organization • State the educational objectives of each section • Introduce concepts in a logical order of increasing difficulty • After each “chunk” of material (7 ± 2 concepts): provide a “walkthrough” example showing how the concepts are used • Avoid forward references

  3. Documentation Guidelines - Appearance & Style • CONSISTENCY - Develop written guidelines for consistent organization, style, and appearance • READABILITY - Use white space and text-organizing conventions to avoid large text blocks. (use headings/subheadings, bullet lists, short paragraphs) • SIMPLICITY - Use simple writing style, even if users are well-educated (users are engaged in many tasks at once)

  4. Tutorial Material • Should describe capabilities at task/functional level. • Should describe capabilities in an action-oriented way. • Use a conceptual model (OAI model) to structure explanations • Start by explaining the task model objects, from the highest level down to “atomic” elements. • Then explain the task model actions, from user’s goals down to specific action steps. • Once user understands the task objects and actions, then show the interface model objects and the mechanisms or command syntax needed to accomplish tasks • Finally, describe shortcuts

  5. Object/Action Interface Model (Shneiderman, Sec. 2.3) [diagram, flattened in this transcript] • Task Model – Objects paired with Actions: domain information / system tasks; visible objects / user operations • Interface Model – Objects paired with Actions: visible symbols / physical actions; program objects / steps • Transition labels: information design stage; mechanism/visual design
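
To make the object/action decomposition concrete, here is a minimal Python sketch for a hypothetical e-mail application; all names below are invented for illustration and nothing here comes from Shneiderman's text:

```python
from dataclasses import dataclass

# Hypothetical sketch: each side of the OAI model pairs a hierarchy of
# objects (highest level down to atomic elements) with a hierarchy of
# actions (user goals down to specific steps).
@dataclass
class Model:
    objects: list   # ordered high-level -> atomic
    actions: list   # ordered goals -> steps

# Task model for a hypothetical e-mail application.
task = Model(
    objects=["correspondence", "mailbox", "message", "header field"],
    actions=["manage mail", "answer a letter", "reply", "type text"],
)

# Interface model: the visible symbols and physical actions that
# realize those tasks.
interface = Model(
    objects=["folder pane", "message window", "Reply button", "cursor"],
    actions=["open folder", "select message", "click Reply", "press keys"],
)

# The tutorial ordering from slide 4: task objects, then task actions,
# then interface objects/mechanisms, and finally shortcuts.
for label, model in (("Task", task), ("Interface", interface)):
    print(f"{label} objects: {model.objects}")
    print(f"{label} actions: {model.actions}")
```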

  6. Creating Good Documentation - Summary • Good: • Progressive approach • Task-oriented examples • Readable explanations • Bad: • Complete specification presented in one text block • Abstract formal notations • Terse technical prose or complex prose style

  7. On-line Help • Pros: • It’s there whenever you need it • Can be updated at low cost • Enhanced by string search, indexes, TOC, bookmarks, hypertext links • Use of color, sound, animation • Cons: • Readability may be less than printed manuals • Presents another user interface to master • Blocks user’s view of workspace

  8. Reading from Paper v. Displays • Studies through the 1980s showed performance disadvantages in reading from display screens: about 30% slower task times, slightly lower accuracy • Readability issues: screen size (frequent paging); placement (looking down is better; rigid posture); contrast, flicker, resolution, curved display surface; fonts, layout, formatting • Other issues: health concerns, fatigue, and stress • But: later studies showed no difference with better-quality displays

  9. Context-sensitive on-line Help • For part of program that is active • For a selected object: using function key (F1); balloon help; prompts for fill-in fields • New approaches for on-line help: FAQs; networked human help available (help desk; user discussion groups/newsgroups)
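
A minimal sketch of F1-style context-sensitive help, assuming a Python/tkinter GUI; the widget names and help strings are invented:

```python
import tkinter as tk
from tkinter import messagebox

# Help text keyed by widget name -- names and strings are invented.
HELP = {
    "name_entry": "Enter the recipient's full name.",
    "save_button": "Saves the form. Use Edit > Undo to reverse it.",
}

root = tk.Tk()
tk.Entry(root, name="name_entry").pack()
tk.Button(root, text="Save", name="save_button").pack()

def show_help(event):
    # Look up help for whichever widget currently has keyboard focus,
    # i.e. the part of the program that is active.
    widget = root.focus_get()
    key = widget.winfo_name() if widget else None
    messagebox.showinfo("Help", HELP.get(key, "No help for this item."))

root.bind_all("<F1>", show_help)   # F1 requests help, as on the slide
root.mainloop()
```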

  10. Four empirical studies • Error messages • Live help systems • Eye-hand coordination • Scent of the Web (searching for information)

  11. Advice on reading empirical studies • What question or issue is being investigated? • Describe the experimental methodology • What was the set-up (HW/SW)? • What were subjects asked to do? • How were the data analyzed? • What conclusions were drawn? • What additional questions do you have about the methodology? • What were the strengths and weaknesses of the study? • Do you think the conclusions were justified (why?)

  12. Revising error messages • Background review: • Norman: 3 ways to approach errors • Norman: 3 kinds of errors • Shneiderman: 3 attributes of good error messages

  13. Revising error messages • Background review: • Norman: 3 ways to approach errors: minimize root causes; make actions reversible; make errors easy to discover and clear how to correct • Norman: 3 kinds of errors: slip, mistake, situational • Shneiderman: 3 attributes of good error messages: positive tone, specific, constructive (and non-anthropomorphic)
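
As a quick illustration of those attributes, here is an invented before/after example (not taken from the Ratner study):

```python
# Bad: vague, negative, anthropomorphic ("I"), offers no way forward.
bad = "FATAL ERROR 4004: I cannot understand your input."

def improved_message(filename: str, max_len: int) -> str:
    # Better: neutral tone, states exactly what is wrong, and tells
    # the user how to correct it.
    return (f"The file name '{filename}' is {len(filename)} characters; "
            f"the limit is {max_len}. Please shorten the name and retry.")

print(improved_message("quarterly_report_final_v7_revised.xlsx", 32))
```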

  14. Revising error messages (cont.) • What question or issue is being investigated? • Describe the experimental methodology • What was the set-up (HW/SW)? • What were subjects asked to do? • How were the data analyzed? • What conclusions were drawn? • What additional questions do you have about the methodology? • What were the strengths and weaknesses of the study? • Do you think the conclusions were justified (why?)

  15. Live Help System • Interaction Elements: • Knowledge base of FAQ items • Continuously updated by assistants • User types NL question, matched to FAQs • Chat interface interacts with human assistants, if retrieved FAQs do not satisfy the user • Feedback on availability of assistants • Feedback on your assistant’s “state” • Dialog history • Text entry area • User model displayed to assistant
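
The slide does not say how questions are matched to FAQ items; a toy word-overlap matcher, with invented FAQ entries, might look like this:

```python
# Toy sketch: rank FAQ entries by word overlap with the user's
# natural-language question. The real system's matching algorithm is
# not described on the slide; entries are invented.
FAQS = {
    "How do I upload art to the gallery?": "Use the Upload page...",
    "Why was my image rejected?": "Images must follow the site rules...",
    "How do I reset my password?": "Click 'Forgot password' at login...",
}

def match_faqs(question, k=2):
    q_words = set(question.lower().split())
    scored = sorted(
        ((len(q_words & set(faq.lower().split())), faq, answer)
         for faq, answer in FAQS.items()),
        reverse=True,
    )
    # Return the k best matches; if none satisfy the user, the system
    # would hand off to a human assistant via the chat interface.
    return [(faq, ans) for score, faq, ans in scored[:k] if score > 0]

print(match_faqs("how do i upload my art"))
```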

  16. Usability Testing of Live Help System • Methodology: field study using Elfwood • Issues to investigate: • Impact on user attitudes, especially trust • Quality of support • Quality of assistant work situation • Assistants and users volunteered • Evaluation by questionnaires (2 for users, 1 for assistants)

  17. Usability Testing of Live Help System (cont.) • Group 1 – users who interacted with assistants: questions to evaluate efficiency; questions to evaluate attitude • Group 2 – users who did not interact with assistants. Why? 15% were satisfied w/FAQ; 38% “just browsing”; 24% could not get the system to work; 29% no assistants available then • Group 3 – volunteer assistants

  18. Design implications for future Live Help System • Emphasize the availability of live help, since users don’t expect it. • Make initiation process very easy • Do not use platform-dependent software (Java applet) • Make availability hours clear for getting human help • Provide queuing status • Provide call-back option • Use visual and audio alert when help becomes available • Consider email or voice options

  19. Cost-justifying usability • Applying traditional cost-benefit analysis to Web UE projects • Context: complex Web apps vs. simple content-only sites; development time and cost approaching other software projects; surveys show ease of use is critical to Web success • Some benefit categories for Web sites: increased buy-to-look ratios (e-commerce model); increased number of visitors (advertising model); decreased cost of other customer service channels; decreased user training cost (internal KM model)
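
For instance, the buy-to-look category could be turned into a monthly dollar figure like this (all numbers invented for illustration):

```python
# Invented numbers illustrating the buy-to-look benefit category.
visitors_per_month = 120_000
buyers_before, buyers_after = 2_400, 3_000   # before / after the redesign
avg_profit_per_sale = 12.50                  # dollars

print(f"buy-to-look: {buyers_before / visitors_per_month:.1%} -> "
      f"{buyers_after / visitors_per_month:.1%}")        # 2.0% -> 2.5%
extra_profit = (buyers_after - buyers_before) * avg_profit_per_sale
print(f"added benefit: ${extra_profit:,.0f}/month")      # $7,500/month
```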

  20. Cost-justifying usability (cont.) • Steps in the methodology: • Start with the UE plan • Establish analysis parameters • Estimate the cost of each lifecycle task in the plan • Select relevant benefit categories • Estimate monthly benefits • Compare cost to benefits • Benefits per month • One-time cost • Payback period
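
A worked example of the final comparison step, using invented figures:

```python
# Invented figures: compare the one-time cost of the UE plan to the
# estimated monthly benefits to get the payback period.
one_time_cost = 38_000     # estimated cost of all UE lifecycle tasks ($)
monthly_benefit = 9_500    # sum of selected benefit categories ($/month)

payback_months = one_time_cost / monthly_benefit
print(f"Payback period: {payback_months:.1f} months")   # 4.0 months
```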

  21. Cost-justifying usability (cont.) • Usability Engineering Plan - activities: • I. User profile • I. Task analysis (problem scenarios) • I. Usability goal setting • II. Information architecture (activity scenarios) • II. Conceptual design (information scenarios) • II. Paper prototype development • II. Usability test • III. Coordinated mechanism and screen design (interaction scenarios) • III. Document design standards • IV. Live prototype development • V. Usability test • VI. Complete user interface design/prototype • Usability test

  22. Compare with Nielsen Usability Life Cycle – 7 Stages • I. Preliminary analysis • Know the user • user characteristics • users’ current and desired tasks • functional analysis • co-evolution of tasks and artifacts • Competitive analysis (automated and non-automated alternatives) • Setting usability goals • financial impact analysis

  23. Usability Life Cycle (cont.) • II. Early design • Parallel design • Participatory design • Domain experts (get used up) • Paper mock-ups or sample screens (not system specs!) • III. Middle Design • Coordinated design of the total interface • Apply guidelines and heuristic analysis

  24. Usability Life Cycle (cont.) • IV. Implemented design: prototyping/scenarios (storyboarding) • V. Empirical testing • VI. Iterative design: solution may or may not help; database (hypertext) of design rationale • VII. Studying usability in the field

  25. Cost-justifying usability (cont.) • Usability Engineering Plan – cost components: • Usability engineer hours • Developer hours • User hours • Equipment

  26. Cost-justifying usability (cont.) • Goals of this activity: win funding for the UE plan; appropriate UE programs • Discussion of Web statistics and their limitations: number of visitors vs. how many were satisfied; how many bought vs. how many did not buy; how many customer support calls processed vs. how many customer problems resolved • Better data would lead to after-the-fact validation and greater credibility in the future
