Evaluating a product's usability and the broader user experience it offers


1. Evaluating a product's usability and the broader user experience it offers. For the Society for Technical Communication, Nov 21, 2002. Suzanne Currie, Usability and UI Design, suzanne_currie@indra.com

2. Agenda • Tell you how I conduct evaluations • Jump right in to usability evaluation • 4 types of evaluations • Case projects • Bits and pieces related to conducting evaluations • Experience evaluation, focus groups • Discussion (throughout)

3. Similar disciplines
Instructional Systems Design (ISD): based on learning and instructional theory. Aims: improve human performance; increase efficiency and effectiveness; ensure the quality of instruction; maximize the learning experience.
Human Computer Interaction (HCI): based on human information processing theory (perceptual, cognitive, and motor). Aims: create successful interaction between people and computers; maximize performance of human and computer together as a system.
How do these common models complement one another? • ADDIE ISD model • User-Centered Design model (UCD) • Software engineering lifecycle

4. Types of products • Paper-based • Computer-based • LAN-based • Internet-based • Electronic books on PDAs or information appliances

5. Usability • A usable application allows the user to focus on the task at hand, not on the application • To reach usability: match the way users work; behave predictably; support users' cognition and perception skills • Usability evaluations … are a confirmation or disputation of how well an application works for the users, not how well the users perform with the application • Results provide excellent input for design improvement

6. User experience Everything felt, observed, and learned through awareness and interaction with a company's space, products, services, and communication. (HannaHodge) • Encompasses all potential user touch points • Application graphical user interfaces • Documentation • Training • Customer contact systems • Customer Touch Points Strategy • The heart of user experience design is reaching high user satisfaction AND strong human performance

7. Evaluation types in the software world • Common confusion around types of evaluation activities • Three broadly-defined types of evaluations • Specification Compliance - Does the software comply with the specification for the software? • Software Performance - Does the software meet business goals for operability and performance? • Usability and Experience - Does the software meet the needs and desires of the direct and indirect users? • Make sure your client understands the differences and has the right expectations

8. Reasons for evaluation • Evaluate before the users do … they will, sooner or later • Suggest improvements to the design • Confirm that the product meets the usability specifications • Confirm acceptability of interface and/or supporting materials • Ensure that it meets customer expectations • Compare alternative designs (depoliticise the comparison of designs) • Match or exceed usability of competitors' products • Ensure that it complies with any statutory requirements such as ISO or accessibility standards

9. User needs drive requirements; huge impact if the user needs aren't properly represented. 1 user need → 10 features → 100 tech spec items (Alan Cooper)
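To make that arithmetic concrete, here is a minimal sketch; the 10x fan-out at each level is an illustrative reading of Cooper's ratio, not a measured figure:

```python
# Rough sketch of the 1:10:100 amplification: one misread user need
# propagates into roughly 10 features and 100 tech spec items.
# The fan-out factors are illustrative assumptions, not measured values.
NEEDS_TO_FEATURES = 10
FEATURES_TO_SPECS = 10

def downstream_items(misidentified_needs: int) -> tuple[int, int]:
    """Return (features, spec items) tainted by misidentified user needs."""
    features = misidentified_needs * NEEDS_TO_FEATURES
    specs = features * FEATURES_TO_SPECS
    return features, specs

features, specs = downstream_items(1)
print(f"1 bad user need -> {features} suspect features -> {specs} suspect spec items")
```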

10. User-centered design and usability [Diagram: system-centred approach versus user-centred approach]

11. Common complaints about usability evals • Insufficient audience sample • Insufficient rationale for judgements • Findings not backed up • Lack of test and analysis rigor • "Ad hoc" test structure • Findings not communicated sympathetically to IT, Marketing, shareholders • Dogmatism and extremism

12. Some suggestions • Define test objectives with client • Define recruitment criteria with client • Add more structure to the evaluations • Keep track of task completion • Improve test moderation skills • Conduct more thorough analysis • Provide meaningful recommendations

13. Usability as design and evaluation • Usability evaluation is commonly thought to be an evaluation of a product after it's been developed • Very powerful in the design phase, e.g. "I design to support usability" • When you evaluate depends on the goals for evaluation and the state of the product
Formative evaluations: early stages of the development lifecycle; iterative design refinement; tend to be structured but informal, inexpensive, and rapid.
Summative evaluations: conclusion of a development effort; quality control and standards compliance; tend to be formal, statistical, expensive, and time-consuming.

14. Different evaluations for different phases

15. Discuss 4 of these types • Usability walkthrough (prototyping, scenarios) • Heuristic evaluation (heuristics) • Usability testing (moderating the test, test plans) • Expert evaluation (analysis)

16. Usability walkthrough
The basics: early usability evaluation; paper prototype; with users (1-6+); facilitated by researcher; scenarios; conference room. Case: Australian Wheat Board, online trading of wheat.
What's needed: write scenarios; create the prototype; make copies of the prototype for each user; dry run the walkthrough.
During the evaluation: describe the process to participants; introduce the scenario; step through the scenario using the mockup; prompt for user feedback; instruct users to write down the actions they would take; record comments.
After the evaluation: analyze all of the comments; prioritize the issues; make recommendations for prototype improvement; revise the prototype.

17. Prototyping
Prototyping is central to design iteration and refinement; iterate 4-5 times.
Benefits: requirements capture; reveals problems/prevents gross mistakes; allows evaluation and discussion from designers and users; users feel involved; results in better usability; economical way of testing designs.
Stages of prototyping: paper-based (low fidelity): sticky notes with labels arranged on a piece of paper; users write all over the prototype, move sticky notes around, draw pictures; printouts from a PowerPoint mockup can be used, but use a handwriting font. Electronic mock-ups: PowerPoint. High fidelity: in the delivery medium.

18. Writing scenarios • Narrative that is written by the researcher from information gained from SMEs and/or users • Scenarios describe: • Users and their goals • Work practice • Actions the user will take to accomplish goals • Responses from the product • Aspects • Paint a picture of the ideal usage • Keep them 'technology agnostic'

19. Scenario example • Suzanne and Greg are very excited about their move to Australia. They've never lived abroad and aren't sure if they'll be able to afford a nice place to rent. They decide to look on the Internet for places to rent to see what they can afford. • After they connect to The Age's website, it is obvious where to find the rentals section. It is clear and easy to understand. • They find a house in a desirable area, but the house they have found is too expensive, so they look for other houses. • They are pleased to find a number of different types of houses. They select all of the houses they are interested in, and print those. • They decide to check out another newspaper's listing. The experience there is frustrating by comparison. They cannot tell what neighborhoods the houses are in, and cannot tell where the neighborhoods are in relation to the CBD. • Frustrated with this site, Suzanne and Greg return to The Age's website to continue exploring houses in other neighborhoods. They are pleased to see that the site remembers their selections.

20. Heuristic evaluation
The basics: midpoint usability evaluation; electronic mockup; without users; heuristics (rules of thumb for good design); evaluator(s); private office. Case: Australia Vinyl, public information website.
What's needed: know the evaluation goals; design and validate the heuristics; have the list of heuristics and their definitions in hand; evaluator(s); time.
During the evaluation: the evaluator becomes familiar enough with the product to know how to get around and what can be found; systematically go through each heuristic, noting any violations against it; assess the macro and the micro elements; identify the element that violates the heuristic; describe how the element violates the heuristic.
After the evaluation: analyze all of the violations; draw conclusions about the larger impact of the violations; make design recommendations that solve the problems caused by the violations.
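A minimal sketch of how the violation log from such an evaluation might be structured, assuming one record per violation; the field names, example entry, and 1-4 severity scale are my own, not from the slides:

```python
from dataclasses import dataclass

@dataclass
class Violation:
    heuristic: str      # which heuristic is violated
    element: str        # the macro or micro element that violates it
    description: str    # how the element violates the heuristic
    severity: int       # 1 (cosmetic) to 4 (catastrophic); an assumed scale

violations: list[Violation] = []

# During the evaluation: note each violation against the heuristic list.
violations.append(Violation(
    heuristic="Visibility of system status",
    element="document search page",
    description="No feedback while a slow search is running",
    severity=3,
))

# After the evaluation: group violations by heuristic to assess larger impact.
by_heuristic: dict[str, list[Violation]] = {}
for v in violations:
    by_heuristic.setdefault(v.heuristic, []).append(v)
print({h: len(vs) for h, vs in by_heuristic.items()})
```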

21. Usability heuristics for software (easily repurposed for paper-based products) • Match between the system and the real world • Consistency and standards • Visibility of system status • Error prevention • Error recovery • User control and freedom • Visual feedback • Aesthetic and minimalist design • Recognition rather than recall • Make the user smart • Flexibility and efficiency of use • Help and documentation

22. Abbreviated heuristics • Content • Interaction and level of engagement • Navigation and efficiency of use • Orientation • Presentation and visual integrity • Structure and hierarchy of information

23. User experience heuristics • Provides a rewarding experience • Appropriately challenging • Not anxiety or phobia producing • Inviting; not intimidating • Fosters curiosity; is inspirational • Fosters a desire for accomplishment • Worthy of exploration • Browsing is rewarded • Supports the user's sense of style • Provides an interesting experience • Effectively manages distance between the author and user personas (the voice of the product versus the voice of the product as relates to the user)

24. Usability testing
The basics: late usability evaluation; structured; coded application; with users; test tasks; facilitator; researcher; in the field or in the lab. Case: HP, public website.
What's needed: know the product's purpose, evaluation goals, and target audience; decide the test environment; decide the location of the evaluator (next to the user or in a separate room); recruit users according to criteria; decide how much qualitative versus quantitative data to gather; create the test structure; decide on the think-aloud protocol; design the user test tasks; create a test script (play-by-play); review good communication practices for facilitating a test; dry run the test. (cont'd …)

25. Usability testing (cont'd)
During the evaluation: the user is welcomed and fills out permission to videotape/audiotape; user pre-survey; the moderator briefs the user and establishes rapport; introduces the user to the test; the user performs tasks while thinking aloud; the moderator keeps in close contact with the user; the moderator performs measurements and takes notes; moves the user to the next task; good moderation practices are followed; the user is thanked; notes are finalized, and the user leaves; the next user is welcomed.
After the evaluation: finalize all of the notes and artefacts; separate qualitative from quantitative; affinity-diagram all of the findings; draw conclusions about the larger impact of the issues; make design recommendations that solve the problems caused by the usability issues.

26. Approaches to usability evaluation (very rough!) • Traditional HCI approach: 30 days, $40k+ • Formal method, lab coat, stopwatch, metrics-based • Discount usability (Nielsen): 2-7 days, $1k-$8k • Also known as "Guerilla HCI", qualitative • Structured quick approach: 7+ days, $15k-$20k+ • Qualitative and some quantitative

27. Guerilla versus structured
Guerilla test: broad test objectives; 5 participants; 1-1.5 hour sessions; run with a user activity scenario; probe as you go; qualitative only; minimal noting.
Structured test: narrow test objectives; 6-8+ participants; 1.5-2 hour sessions; run with appropriate test tasks; conduct task benchmarking (no probing until the task is complete); qualitative and quantitative, capturing times, success rate, etc.; more questionnaires; detailed noting.

28. Planning the test • Know your users • Know the market and domain • Craft test objectives • Identify the factors to record • Define product success criteria • Choose types of test tasks • Select test tasks • Construct test tasks • Choose the evaluation environment • Sample test structure • Conducting the test

29. Know your users • Expertise level (novice, intermittent, frequent) • Familiarity with specific hardware and software • Information access needs, e.g. summary level or detailed level • Information retrieval preferences, e.g. search, browse • Motor skill level with regard to delivery medium • General educational level • Domain knowledge and related skill level • Age, gender, other considerations

30. Know the market and the domain • Market information • Identify all the markets • Select the markets that are germane to the evaluation • Identify the types of people within those markets to participate in the evaluation • Domain knowledge • The domain, to a large extent, determines the context of use • Understand the domain and the human strategies and skills invoked by that domain

31. Crafting test objectives • Broad objectives: "Let's find all the problem spots" • Narrow objectives: "Let's identify what's inhibiting the user's productivity" • When creating objectives, define them at the narrowest, most detailed level possible

32. Identify the factors to record • Speed of operation • Completion rate • Error free rate • Satisfaction rating • Advanced feature usage • Path analysis • Probing • Emotions • User suggestions
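A minimal sketch of rolling several of these factors up from per-participant observations; the record fields, the 1-5 satisfaction scale, and the sample data are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    participant: str
    task: str
    seconds: float      # speed of operation: time to finish or abandon
    completed: bool     # task completion
    errors: int         # error count, for the error-free rate
    satisfaction: int   # post-task rating, assumed 1-5 scale

observations = [
    Observation("P1", "find rentals", 142.0, True, 0, 4),
    Observation("P2", "find rentals", 301.5, False, 3, 2),
    Observation("P3", "find rentals", 188.2, True, 1, 4),
]

# Quantitative rollup across participants for one task.
n = len(observations)
completion_rate = sum(o.completed for o in observations) / n
error_free_rate = sum(o.errors == 0 for o in observations) / n
mean_satisfaction = sum(o.satisfaction for o in observations) / n
print(f"completion {completion_rate:.0%}, error-free {error_free_rate:.0%}, "
      f"satisfaction {mean_satisfaction:.1f}/5")
```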

33. Define product success criteria

34. Choose type of test tasks • Atomized tasks: discrete, small; unlike user activity scenario tasks, these don't make up a single story; suitable for measurement and randomizing • Exploratory tasks: non-directed; to capture users' initial reactions • User activity scenario tasks: suited to procedural applications; tasks "tell a story"; watch for this: users often get lost in the detail, or the user doesn't identify with the scenario • "User-designed" tasks: designed by the users themselves • Systematic exploration of the site components: tasks test site components instead of user tasks

35. Selecting test tasks • Typical tasks (3): tasks performed 80% of the time • Critical tasks (2): high-priority tasks which may be very expensive to the company • Problematic tasks (2): known trouble spots • Infrequent tasks (1): tasks that occur infrequently, but which may determine user satisfaction or may be expensive to the company
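A minimal sketch of assembling a test script from this mix; the counts follow the slide, while the task pool contents and the shuffling step are assumptions:

```python
import random

# Candidate task pool by category; pool contents are invented examples.
POOL = {
    "typical":     ["search listings", "print selected items", "refine a search"],
    "critical":    ["complete a purchase", "cancel an order"],
    "problematic": ["use the site map", "recover a lost password"],
    "infrequent":  ["update billing details"],
}

# Counts from the slide: 3 typical, 2 critical, 2 problematic, 1 infrequent.
MIX = {"typical": 3, "critical": 2, "problematic": 2, "infrequent": 1}

script = []
for category, count in MIX.items():
    script.extend(random.sample(POOL[category], count))

random.shuffle(script)  # randomize order to reduce sequence effects
print(script)
```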

36. Constructing test tasks • 2-4 sentences in length • Indicate typical context of use, including the characters and the situation, the environment where the task is likely to be performed, appropriate time pressures, and an appropriate level of detail • "Sally is in a rush and suddenly realizes her mom's birthday is fast approaching. She decides to send flowers. Mom loves gladiolas. Show me how Sally goes about sending her mom flowers." • "Robert needs to install an updated driver for his printer. Show me how he does this."

37. Evaluation environments • In the field • Immersed in context of use • Slightly removed from context of use • In the conference room • In the lab or focus group site • Researcher in room with participant • Researcher behind the one-way mirror • Remote evaluation • Tester and observers are not in the same locale

38. Note on usability labs Considerations • Methods and techniques • Equipment needs • Flexible and reconfigurable • Appropriate ambience: comfortable, warm, edgy environment • Example labs: SCHIL usability lab at Swinburne Uni, Melbourne; Hiser's Experience Lab, Melbourne; IDEA Lab, Melbourne Uni, Melbourne [Floor plans: yellow = test room, blue = observation room]

39. Note on the lab environment • Audio • Video • Document camera • Picture-in-picture • Microphones • Capture users': facial expressions; pen or mouse location (scan converter)

40. Sample test structure • Pre-test questionnaire (in the waiting area) • Greet participant • Explain test to participant • Optional start task • Participant takes 10 minutes to familiarize themselves with the product and provide first impressions • Run tasks: • Participant performs task • Moderator determines success of completion • After task has ended, probing • Optional task evaluation • Repeat • Post-test interview • Post-test questionnaire

41. Conducting the test • Observe and record user's behaviour and comments • Think-aloud protocol • Establishing rapport • Ask open-ended questions • Dealing with difficult personalities • Users blaming themselves • Judging task success/failure • Questioning users • Positive questions – all questions should be asked in a positive, supportive and non-threatening way • Open – encourage an individual to talk and provide maximum information • Closed – can be answered in a few words or sentences • Probing – usually to follow up on a response to ask for more details

42. Reminder: testing is about observation … witnessing • It's important to keep chatting to an absolute minimum • The participant should be speaking at least 80% of the time • Use clear, direct language • Have a neutral, accepting style … not harsh and not too familiar

43. Ethical treatment of users
General ethics: consider users first; safeguard users' rights, interests, and sensitivities; communicate research objectives; protect the privacy of users; never exploit users; make privacy policies available to users.
Specific practices: permission to audio/videotape; stated purpose of audio/videotapes; non-identifying artefacts; non-identifying results in the report.

44. Expert evaluation
The basics: late usability evaluation; application prototype or coded app; without users, but with personas; user tasks in hand; design principles, rules, heuristics, experience; expert evaluator; private office. Case: Scape, entertainment site with a loyalty program attached.
What's needed: know the evaluation goals; know the intent of the product and the user tasks that need to be supported; have a list of sample user tasks and sample user profiles in hand; evaluator(s); time.
During the evaluation: the evaluator becomes familiar enough with the product to know how to get around and what can be found; systematically go through each user task and try to accomplish it; note the usability issues that are likely to frustrate users or prevent them from accomplishing tasks; record each issue and its impact on task accomplishment.

45. Expert evaluation (cont'd)
After the evaluation: analyze and categorize all of the issues; roll up the issues to comment on the weaknesses/strengths of the product; provide an indication of which user tasks are most at risk; create a long issues table containing: product component, usability issue, significance, severity, suggestions.
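A minimal sketch of that issues table kept as a flat CSV for sharing; the column names follow the slide, and the example row is invented:

```python
import csv

# Columns follow the slide's issues table; the example row is invented.
COLUMNS = ["product_component", "usability_issue", "significance",
           "severity", "suggestion"]

rows = [{
    "product_component": "loyalty signup form",
    "usability_issue": "Required fields are only flagged after submit fails",
    "significance": "Blocks the signup task for first-time users",
    "severity": "High",
    "suggestion": "Mark required fields inline and validate before submit",
}]

with open("expert_eval_issues.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
```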

46. After the evaluation • Analysis and synthesis • Reporting results

47. Analysis and synthesis • Review observations or findings • Rate findings • Severity • Priority to fix • Frequency, impact, and persistence • Notice patterns • Draw conclusions about tendencies • Look for "root cause" of usability or satisfaction issues • Use analysis rooms and sticky notes
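Severity is often rolled up from frequency, impact, and persistence; here is a minimal sketch of one such rollup, where the multiplicative formula and the 1-3 scales are assumptions, not the presenter's method:

```python
def severity_score(frequency: int, impact: int, persistence: int) -> int:
    """Combine 1-3 ratings into one score; 3 is worst on each axis.

    The multiplicative rollup is an assumption for illustration; teams
    often weight the axes differently or use a lookup rubric instead.
    """
    for rating in (frequency, impact, persistence):
        if not 1 <= rating <= 3:
            raise ValueError("ratings are on a 1-3 scale")
    return frequency * impact * persistence

# A problem hit by most users (3) that blocks the task (3) and that
# users never learn to work around (3) scores the maximum of 27.
print(severity_score(frequency=3, impact=3, persistence=3))  # 27
```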

48. Reporting results • "In their words": users' artefacts, testimonials, highlights tapes • Likes as well as dislikes: always point out the positive as well as the negative

49. Additional evaluation types • User Experience evaluation • Activity-based focus groups

50. User Experience evaluation • Assessing the user experience strategy • Reviewing these strategies: positioning, brand, content strategies • Measuring: brand perception, task completion, error rates, patterns of behaviour • Case: Foxtel, Call Centre and CRM apps
