

(1) Action Analysis (2) Automated Evaluation


Presentation Transcript


  1. (1) Action Analysis (2) Automated Evaluation December 7, 2004

2. Hall of Fame or Hall of Shame? • Bryce 2 • for building 3D models

3. Hall of Shame! • Icons all look similar • what do they do? • How do you exit? • Note • nice visuals, but it must still be usable • What if it is purely for entertainment?

  4. (1) Action Analysis (2) Automated Evaluation December 7, 2004

5. Outline • Review • Action analysis • GOMS? What’s that? • The G, O, M, & S of GOMS • How to do the analysis • Announcements • Automated evaluation tools

6. Review: Toolkit Details • Models for images? • strokes, pixels, regions • what is good about the stroke model? • saves space & computation, but can’t represent images well • what is aliasing & how do we fix it? • jaggies due to low resolution -> antialias (partially fill in adjacent pixels) • Clipping? • drawing only the regions that are visible to the user • Windowing systems • special problem with networked window systems? • latency • Input events, such as • keyboard, mouse, window, etc. • Main event loop • used to dispatch events • Interactor trees used for • figuring out where to dispatch events (see the sketch below) • Dispatching events • two main ways… • Event focus determines • what widget current events go to
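To make the dispatch bullets concrete, here is a minimal toy sketch of a main event loop walking an interactor tree. All names and the dict-based event format are illustrative, not from any real toolkit.

```python
import queue

class Interactor:
    """A node in the interactor tree; leaf widgets override handle()."""
    def __init__(self, bounds, children=()):
        self.bounds = bounds            # (x, y, w, h)
        self.children = list(children)

    def contains(self, x, y):
        bx, by, bw, bh = self.bounds
        return bx <= x < bx + bw and by <= y < by + bh

    def dispatch(self, event):
        # Positional dispatch: the deepest interactor under the
        # pointer gets the first chance to consume the event.
        for child in self.children:
            if child.contains(event["x"], event["y"]) and child.dispatch(event):
                return True
        return self.handle(event)

    def handle(self, event):
        return False                    # did not consume the event

def main_event_loop(root, events):
    # Classic main event loop: block for the next event, dispatch
    # it into the tree, repeat until a quit event arrives.
    while True:
        event = events.get()
        if event["type"] == "quit":
            break
        root.dispatch(event)

q = queue.Queue()
q.put({"type": "mouse-down", "x": 20, "y": 20})
q.put({"type": "quit"})
main_event_loop(Interactor((0, 0, 800, 600)), q)
```

The other main dispatch style keys off event focus (e.g., keyboard focus goes to one widget) rather than pointer position.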

7. Action Analysis Predicts Performance • Cognitive model • models some aspect of human understanding, knowledge, intentions, or processing • two types • competence • predicts behavior sequences • performance • predicts performance, but limited to routine behavior • Action analysis uses a performance model to analyze goals & tasks • generally done hierarchically (similar to task analysis)

8. GOMS – Most Popular Action Analysis • Family of UI modeling techniques • based on the Model Human Processor • GOMS stands for (?) • Goals • Operators • Methods • Selection rules • Input: detailed description of UI/task(s) • Output: qualitative & quantitative measures

9. Quick Example • Goal (the big picture) • go from hotel to the airport • Methods (or subgoals)? • walk, take bus, take taxi, rent car, take train • Operators (or specific actions) • locate bus stop; wait for bus; get on the bus; ... • Selection rules (choosing among methods)? • Example: walking is cheaper, but tiring and slow • Example: taking a bus is complicated abroad (see the sketch below)
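A minimal sketch of how this example could be encoded as data, with selection rules as predicate/method pairs. The structure, operators, and contexts are our own illustration, not a formal GOMS notation.

```python
# The hotel-to-airport example as a toy GOMS model.
goms = {
    "goal": "go from hotel to the airport",
    "methods": {
        "walk":      ["get directions", "walk to airport"],
        "take-bus":  ["locate bus stop", "wait for bus", "get on the bus",
                      "ride to airport", "get off at airport"],
        "take-taxi": ["hail taxi", "state destination", "ride", "pay driver"],
    },
    # Selection rules: the first predicate that matches picks the method.
    "selection_rules": [
        (lambda ctx: ctx.get("budget") == "low", "walk"),   # cheap but slow & tiring
        (lambda ctx: ctx.get("abroad"), "take-taxi"),       # buses are complicated abroad
        (lambda ctx: True, "take-bus"),                     # default
    ],
}

def select_method(model, ctx):
    for predicate, method in model["selection_rules"]:
        if predicate(ctx):
            return method

print(select_method(goms, {"abroad": True}))   # -> take-taxi
```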

10. GOMS Output • Execution time • add up times from operators (see the KLM sketch below) • assumes experts (have mastered the tasks) • error-free behavior • very good rank ordering • absolute accuracy ~10-20% • Procedure learning time (NGOMSL only) • accurate for relative comparison only • doesn’t include time for learning domain knowledge
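The "add up times from operators" step is easiest to see at the keystroke level. A minimal sketch using the commonly cited keystroke-level model operator times; treat the numbers as rough textbook defaults, not measurements.

```python
# Commonly cited KLM operator times, in seconds (rough defaults).
KLM_TIMES = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "B": 0.10,  # press or release a mouse button
    "M": 1.35,  # mental preparation
}

def execution_time(operators):
    """Predicted expert, error-free time for a KLM operator sequence."""
    return sum(KLM_TIMES[op] for op in operators)

print(execution_time("M K K K".split()))   # 1.35 + 3 * 0.28 = 2.19 s
```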

11. Using GOMS Output • Ensure frequent goals can be achieved quickly • Building the hierarchy is often where the value lies • functionality coverage & consistency • does the UI contain the needed functions? • consistency: are similar tasks performed similarly? • operator sequence • in what order are individual operations done?

12. Comparative Example - DOS • Goal: delete a file • Method for accomplishing the goal of deleting a file • retrieve from long-term memory that the command verb is “del” • think of the directory name & file name and make it the first listed parameter • accomplish goal of entering & executing the command • return with goal accomplished

13. Comparative Example - Mac • Goal: delete a file • Method for accomplishing the goal of deleting a file • find the file icon • accomplish goal of dragging the file to the trash • return with goal accomplished
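A hedged worked comparison of the two delete methods, using the same keystroke-level estimates as above. The operator sequences are our own rough encoding, and the file name is made up.

```python
# Rough KLM encoding of the two delete methods (times in seconds).
KLM = {"K": 0.28, "P": 1.10, "B": 0.10, "M": 1.35}

def t(ops):
    return sum(KLM[o] for o in ops)

# DOS: mentally prepare, type "del report.txt", press Enter.
dos = ["M"] + ["K"] * (len("del report.txt") + 1)   # +1 for Enter

# Mac: mentally prepare, point at the icon, press the button,
# drag to the trash (a second pointing act), release.
mac = ["M", "P", "B", "P", "B"]

print(f"DOS delete: {t(dos):.2f} s")   # 5.55 s for this file name
print(f"Mac delete: {t(mac):.2f} s")   # 3.75 s
```

The absolute numbers matter less than the rank ordering, which is what GOMS predicts well.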

14. Applications of GOMS • Compare different UI designs • Profiling (time) • Building a help system • modeling makes user tasks & goals explicit • can suggest questions users will ask & the answers

15. Tradeoffs of Using GOMS • Advantages • gives qualitative & quantitative measures • less work than a user study • easy to modify when the UI is revised • Disadvantages • takes lots of time, skill, & effort • research: tools to aid the modeling process • only works for goal-directed tasks • not problem solving or creative tasks (design) • assumes tasks are performed by experts w/o error • does not address several UI issues, e.g. • readability, memorability of icons & commands

16. Announcements • Make sure your web sites are up to date • I scanned last night and saw lots of material missing • PowerPoint slides, all assignments, mailto link for team! • Use design patterns to guide your design • Make sure all links work & are on OUR disk space (we will archive) • We will start grading these after the final • Write-up for the user testing assignment due by 5 PM Friday evening (online & at Richard’s or Kate’s office) • Final presentations • Guggenheim 217 • 22 registered industry/UW guests • Dress appropriately • Bring a resume if looking for a job • summer or permanent • Give demos after everyone has presented • I’ll supply lunch if you can hang around from 12-1 • Questions?

17. Rapid Iterative Design is the Best Practice for Creating Good UIs • We have seen how computer-based tools can improve the design (e.g., Denim) & prototyping (e.g., VB) phases • (diagram: design -> prototyping -> evaluation cycle)

18. Online, Remote Usability Testing • Use the web to carry out usability evaluations • Main approach is remote usability testing • e.g., NetRaker (now KeyNote WebEffective) • combines usability testing + market research techniques • automatic logging & some analysis of usage

19. Remote Usability Testing • Move usability testing online • research participants access the “lab” via the web • answer questions & complete tasks in a “survey” • system records actions or screens for playback • can test many users & tasks -> good coverage • Analyze data in aggregate or individually (see the sketch below) • find general problem areas • use average task times or completion rates • play back individual sessions • focus on problems, as in traditional usability testing
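A minimal sketch of the aggregate analysis: computing a completion rate and a mean time-on-task from session logs. The log layout is hypothetical.

```python
from statistics import mean

# Hypothetical remote-test log: (participant, task, completed, seconds).
sessions = [
    ("p1", "find-price", True, 48.0),
    ("p2", "find-price", True, 62.5),
    ("p3", "find-price", False, 120.0),
]

def task_summary(sessions, task):
    rows = [s for s in sessions if s[1] == task]
    done = [s for s in rows if s[2]]
    return {
        "completion_rate": len(done) / len(rows),
        "mean_time_completed_s": mean(s[3] for s in done),
    }

print(task_summary(sessions, "find-price"))
# {'completion_rate': 0.666..., 'mean_time_completed_s': 55.25}
```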

20. NetRaker: Web Experience Evaluation • NetRaker Index • short pop-up survey shown to 1 in n visitors (sampling sketched below) • on-going tracking & evaluation data • NetRaker Experience Evaluator • surveys & task testing • records clickstreams as well • invite delivered through email, links, or pop-ups • NetRaker Experience Recording • captures “video” of a remote participant’s screen • indexed by survey data or task performance
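The "1 in n visitors" intercept is just random sampling. A toy sketch; a real deployment would also remember the decision (e.g., in a cookie) so a visitor is not re-asked on every page view.

```python
import random

def should_show_survey(n=100):
    # Show the pop-up survey to roughly 1 in n visitors.
    return random.randrange(n) == 0

hits = sum(should_show_survey(100) for _ in range(100_000))
print(hits)   # close to 1,000, i.e., ~1% of visitors
```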

21.–22. NetRaker Index: on-going customer intelligence gathering • Small number of rotated questions increases response rate (screenshots)

23. NetRaker Index: on-going customer intelligence gathering • Increasing these indices (e.g., retention) moderately (5%) leads to a large increase in revenue growth

24.–26. NetRaker Experience Evaluator / NetRaker Usability Research: see how customers accomplish real tasks on the site (screenshots)

27. WebQuilt: Visual Analysis • Goals • link page elements to user actions • identify behavior/navigation patterns • highlight potential problem areas • Solution • interactive graph based on web content • nodes represent web pages • edges represent aggregate traffic between pages (see the sketch below) • designers can indicate expected paths • color code common usability interests • filtering to show only target participants • use zooming for analyzing data at varying granularity
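A minimal sketch of the aggregation behind such a graph: nodes are pages, and edge weights count how many participants made each page-to-page transition. The clickstream data here is made up.

```python
from collections import Counter

# Hypothetical clickstreams, one list of page URLs per participant.
clickstreams = [
    ["/home", "/search", "/results", "/item/42"],
    ["/home", "/browse", "/item/42"],
    ["/home", "/search", "/results", "/item/7"],
]

edges = Counter()
for path in clickstreams:
    for src, dst in zip(path, path[1:]):   # consecutive page pairs
        edges[(src, dst)] += 1

for (src, dst), n in edges.most_common():
    print(f"{src} -> {dst}: {n}")
# Heavily used edges that fall off the designer's expected path are
# the "potential problem areas" to inspect.
```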

28.–30. (screenshots of WebQuilt visualizations)

31. Advantages of Remote Usability Testing • Fast • can set up research in 3-4 hours • get results in 24 hours • More accurate • can run with large sample sizes • 50-200 users -> reliable bottom-line data (statistically significant; see the sketch below) • uses real people (customers) performing tasks • natural environment (home/work/own machine) • Easy to use • templates make setup easy for non-specialists • Can compare with competitors • indexed to national norms
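Why larger samples give "reliable bottom-line data": the confidence interval around a completion rate shrinks like 1/sqrt(n). A quick sketch using the normal approximation, which is reasonable for rates away from 0 or 1.

```python
import math

def ci95(p_hat, n):
    # 95% normal-approximation interval for a proportion.
    half = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half, p_hat + half

for n in (10, 50, 200):
    lo, hi = ci95(0.70, n)
    print(f"n={n:3d}: 70% completion, 95% CI [{lo:.2f}, {hi:.2f}]")
# n=10 gives roughly ±0.28; n=200 narrows that to about ±0.06.
```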

32. Disadvantages of Remote Usability • Miss observational feedback • facial expressions • verbal feedback (critical incidents) • can replace some of this w/ phone & chat • Need to involve human participants • costs money (typically $20-$50/person)

33. Summary • GOMS • provides info about important UI properties • doesn’t tell you everything you want to know about a UI • only gives performance for expert behavior • hard to create the model, but still easier than user testing • changing it later is much less work than the initial generation • Automated usability • faster than traditional techniques • can involve more participants -> convincing data • easier to do comparisons across sites • tradeoff: you lose observational data

34. Next Time • Final presentations • Guggenheim 217
