
Asking Users and Experts



  1. Asking Users and Experts Yujia ZHU Yimeng DOU

  2. Asking Users • Interviews • Questionnaires

  3. Asking Users --- Interview Interviews can be thought of as a “conversation with a purpose” (Kahn and Cannell, 1957) How much the interview is like an ordinary conversation depends on the questions to be answered and the type of interview to be used.

  4. Guidelines for developing questions Plan to keep questions short and straightforward, and avoid asking too many • Avoid long questions • Avoid compound sentences • Avoid using jargon • Avoid leading questions like “Why do you like it?” • Be alert to unconscious biases

  5. Guidelines for planning an interview Try to make the interview as pleasant as possible, so that interviewees feel comfortable • Introduction • Warm-up session • Main session • Cool-off period • Closing session

  6. Conducting Interviews The golden rule is to be professional • Dress as similarly to the interviewees as possible: dress neatly and avoid standing out • Prepare an informed consent form and obtain signatures • Make sure your recording equipment works • Record answers exactly

  7. 4 types of interviews • Open-ended or unstructured • Structured • Semi-structured • Group interviews

  8. Unstructured Interviews • Features: • Interviewers have less control over the process • Open (the format and content of answers are not predefined) • Benefit: • Generates rich data • Disadvantages: • The data are time-consuming and difficult to analyze • The process is impossible to replicate

  9. Structured Interviews • Features • The interviewers have more control • Typically, the questions are closed • To work best: • Questions need to be short and clearly worded • Questions should be refined by asking another evaluator to review them and by running a small pilot study • Choose the type of interview according to the evaluation goals and the questions to be asked

  10. Semi-Structured Interviews • Features • Combine features of structured and unstructured interviews • The interviewer starts with preplanned questions and then probes the interviewee to say more • Some ways to improve the interview • Neutral probes are a device for getting more information • Prompt the person to help him/herself along • Accommodate silences • Probing and prompting should aim to help the interview WITHOUT introducing biases

  11. Group Interviews • Focus group: normally three to ten people are involved • Benefits: • Allows diverse or sensitive issues to be raised • High validity • Low cost, quick results, easily scaled • Disadvantages: • Requires a skillful facilitator • Difficult to get people together at a suitable location and time

  12. Data analysis and interpretation in Interviews • Analysis of unstructured interviews can be time-consuming, though their contents can be rich • Data from structured interviews is usually analyzed quantitatively

  13. Asking Users - Questionnaires • A well-established technique for collecting demographic data and users’ opinions • Questionnaire vs. interview • A questionnaire can be distributed to a large number of people • Interviews are easy and quick to conduct

  14. Designing questionnaires • Begin with basic demographic information, then move from general to specific questions • Advice for designing a questionnaire • Make questions clear and specific • Ask closed questions and offer a range of answers • Include a “no-opinion” option • Think about the ordering of questions • Avoid complex multiple questions • Provide an appropriate range of answers • The ordering of scales should be intuitive and consistent • Avoid jargon • Give clear instructions • Balance the use of white space against the need to keep the questionnaire as compact as possible

  15. Question and response formats • Different types of responses • Discrete responses (“Yes” or “No”) • Locating the answer within a range • A single preferred opinion • Commonly used formats • Check boxes and ranges • Appropriate ranges • Rating scales • Likert scales • Semantic differential scales

  16. Likert Scales • Used for measuring opinions, attitudes, and beliefs • Widely used for evaluating user satisfaction with products • Example: “The use of color is excellent”: Strongly agree | Agree | OK | Disagree | Strongly disagree

  17. Designing Likert Scales • Gather a pool of short statements about the features of the product that are to be evaluated • Divide the items into groups with about the same number of positive and negative statements in each group • Decide on the scale • Select items for the final questionnaire and reword as necessary to make them clear
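Once the statements are chosen, a respondent's answers can be combined into a single score. A common convention, not spelled out in the slides, is to reverse-code the negatively worded statements so that a high score always means a favorable attitude. The sketch below assumes a 5-point scale; the items and data are made up for illustration:

```python
# Scoring Likert responses: a minimal sketch with hypothetical items.
# Assumes a 5-point scale where 5 = "strongly agree". Negatively worded
# items are reverse-coded so a high score always means a favorable attitude.

SCALE_MAX = 5

# Each item is (statement, is_negative)
items = [
    ("The use of color is excellent", False),
    ("The layout is confusing", True),  # negatively worded statement
]

def score_response(response, items, scale_max=SCALE_MAX):
    """Return the mean item score for one respondent.
    `response` is a list of raw ratings (1..scale_max), one per item."""
    adjusted = []
    for rating, (_, negative) in zip(response, items):
        # Reverse-code negative items: 1 <-> 5, 2 <-> 4, etc.
        adjusted.append(scale_max + 1 - rating if negative else rating)
    return sum(adjusted) / len(adjusted)

print(score_response([5, 1], items))  # both answers favorable -> 5.0
```

Reverse-coding is one reason the slide recommends balancing positive and negative statements: it discourages respondents from mechanically ticking the same column.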

  18. Semantic differential scales • Explore a range of bipolar attitudes about a particular item • Each pair of attitudes is represented as a pair of adjectives • Example (evaluating a homepage): Attractive - Ugly; Clear - Confusing; Dull - Colorful; Exciting - Boring; Annoying - Pleasing; …

  19. Administering questionnaires • Two important issues • Reaching a representative sample of participants • Ensuring a reasonable response rate • Some ways to encourage a good response • Ensuring the questionnaire is well designed • Providing a short overview section • Including a stamped, self-addressed envelope for its return • Explaining why you need the questionnaire to be completed and assuring anonymity • Contacting respondents through a follow-up letter, phone call or email • Offering incentives such as payments

  20. Online questionnaires • Advantages: • Quick responses • Low cost (no copying or postage) • Data are transferred immediately • Data are available for analysis in a short time • Errors in the questionnaire design are easy to correct • Two types: email vs. web-based • Email: can target specific users • Web-based: more flexible; can use check boxes, pull-down and pop-up menus, help screens, and graphics • Problems: • Obtaining a random sample • Response rates may be lower

  21. Developing a web-based questionnaire • 3 steps • Designing it on paper, following the general guidelines • Developing strategies for reaching the target population • Turning the paper version into a web-based version • Produce an error-free interactive electronic version from the paper version • Make the questionnaire accessible from all common browsers and readable from different-size monitors and different network locations • Make sure information identifying each respondent will be captured and stored confidentially • User-test the survey with pilot studies before distributing

  22. Analyzing questionnaire data • Identify any trends and patterns • Use a spreadsheet such as Excel to hold the data • Simple statistics • Number or percentage of responses in a particular category • Bar charts can also be used to display data graphically • More advanced statistical techniques, such as cluster analysis, can also be applied
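For the simple statistics mentioned above, a few lines of code can stand in for the spreadsheet. A minimal sketch, using invented responses, that tallies one closed question into counts and percentages:

```python
from collections import Counter

# Tally responses to one closed question (the data here are made up).
responses = ["agree", "agree", "disagree", "no opinion", "agree", "disagree"]

counts = Counter(responses)
total = len(responses)
for category, n in counts.most_common():
    print(f"{category}: {n} ({100 * n / total:.1f}%)")
# Prints, e.g.: agree: 3 (50.0%)
```

The same counts feed directly into a bar chart; the more advanced techniques the slide mentions, such as cluster analysis, would sit on top of a table like this one.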

  23. Asking Experts When users are not accessible, or involving them is too expensive, we can ask experts, or a combination of experts and users, to provide feedback.

  24. Various Inspection Techniques • Heuristic evaluations • Walkthroughs Experts inspect the human-computer interface and predict problems users would have when interacting with it.

  25. Advantages • Relatively inexpensive • Easy to learn • Effective • Can be used at any stage of a design project *Usually, when using heuristic evaluation, five evaluators can identify around 75% of the total usability problems.
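The "five evaluators find about 75%" figure comes from Nielsen and Landauer's model, which treats problem discovery as roughly independent across evaluators: the proportion of problems found by i evaluators is 1 - (1 - L)^i, where L is the probability that a single evaluator spots a given problem. The value of L below is an assumption chosen to reproduce the 75% figure; reported averages for L vary by study, roughly between 0.2 and 0.5.

```python
# Nielsen-Landauer model of how many usability problems i evaluators find.
# L = 0.24 is an assumed per-evaluator detection rate, picked so that five
# evaluators land near the 75% figure quoted above.

def proportion_found(evaluators, L=0.24):
    """Expected proportion of all problems found by `evaluators` people."""
    return 1 - (1 - L) ** evaluators

for i in (1, 3, 5, 10):
    print(f"{i} evaluators: {proportion_found(i):.0%}")
```

The curve flattens quickly, which is the usual argument for stopping at around five evaluators rather than paying for more.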

  26. Heuristic Evaluation • Developed by Jakob Nielsen and colleagues • It’s an informal usability inspection technique. • Experts are guided by a set of usability principles. • These principles are known as heuristics. Experts evaluate whether user-interface elements conform to those principles.

  27. Heuristics (1) • Visibility of system status • Match between system and the real world • User control and freedom • Consistency and standards • Help users recognize, diagnose, and recover from errors

  28. Heuristics (2) • Error prevention • Recognition rather than recall • Flexibility and efficiency of use • Aesthetic and minimalist design • Help and documentation

  29. Different Heuristics For Specific Purposes • Core heuristics are too general • Below is an example of a set of heuristics for websites • There are also heuristics for evaluating toys, WAP devices, online communities, wearable computers, etc. • These heuristics are developed by tailoring Nielsen’s heuristics and drawing on market research

  30. HOMERUN - Heuristics For Commercial Websites • High-quality content • Often updated • Minimal download time • Ease of use • Relevant to users’ needs • Unique to the online medium • Netcentric corporate culture

  31. How To Do Heuristic Evaluation • Briefing Session • Evaluation Period • Debriefing Session

  32. How To Do Heuristic Evaluation • Briefing Session: tell the experts what to do, using a prepared script as a guide • Evaluation Period • Debriefing Session

  33. How To Do Heuristic Evaluation • Briefing Session • Evaluation Period: two passes. In the 1st pass, gain a feel for the system; in the 2nd pass, focus on specific interface elements • Debriefing Session

  34. How To Do Heuristic Evaluation • Briefing Session • Evaluation Period • Debriefing Session: experts come together to discuss their findings, prioritize the problems found, and suggest solutions

  35. Heuristic Evaluation • Selecting appropriate heuristics is very important • Because users are not involved, there are fewer practical and ethical issues. • A week is often cited as the time needed to train experts to be evaluators

  36. Dilemma: Problems or False Alarms? • Different approaches often identify different problems, and heuristic evaluation sometimes misses severe problems. • About 33% of reported problems are real usability problems, while heuristic evaluation misses about 21% of users’ problems. • About 43% of reported problems are not problems at all.

  37. How to reduce the number of false alarms and missed problems? • Use complementary user-testing techniques along with heuristic evaluation. • Check whether the experts really have the expertise they claim. • Use several evaluators to avoid one person’s bias or poor performance.

  38. Heuristic Evaluation Of Websites • In 1999, usability consultant Keith Cogdill was commissioned by NLM to evaluate MEDLINEplus, and he identified seven heuristics. • These heuristics were given to three experts, who independently evaluated MEDLINEplus.

  39. Heuristics used by Cogdill • Internal consistency • Simple dialog • Shortcuts • Minimizing memory load • Preventing errors • Feedback • Internal locus of control *What heuristics would we use to analyze the ICS website?

  40. Results Of The Study • Layout: uncomplicated vertical design; well suited for printing; conservative use of graphics • Internal consistency: the formatting of pages and the logo is consistent across the website • Arrangement of health topics: topics should be arranged alphabetically as well as in categories • Depth of navigation menu: increase the fan-out of the navigation menu in the left margin

  41. Summary of Heuristics for Web Design • Navigation: avoid orphan pages; avoid excessive white space that results in long pages; provide navigation support; avoid narrow, deep hierarchical menus; avoid non-standard link colors; provide a consistent look and feel for navigation • Access: avoid complex URLs; avoid long download times • Information design: content comprehension and aesthetics

  42. Heuristics For… • For online communities • Sociability • Usability • For other devices (handhelds, computerized toys)

  43. Another Technique: Walkthroughs • Cognitive Walkthroughs • Pluralistic Walkthroughs

  44. Cognitive Walkthroughs—Definition • Simulating the user’s problem-solving process at each step in the human-computer dialog • Checking to see whether the user’s goals and memory for actions can be assumed to lead to the next correct action • Cognitive walkthroughs focus on evaluating designs for ease of learning

  45. Cognitive Walkthroughs—Steps (1-3) 1. Identify the characteristics of typical users; develop sample tasks; produce the interface’s prototype, or a description of it; and generate a clear sequence of the actions needed for users to complete each task 2. A designer and one or more experts begin the analysis 3. The evaluators walk through the action sequences for each task, trying to answer the following questions: Will users know what to do, see how to do it, and understand from feedback whether the action was correct or not?

  46. Cognitive Walkthroughs—Steps (4-5) 4. Record critical information, including: • Assumptions about what would cause problems and why • Notes about side issues and design changes • A summary of the results 5. Revise the design according to the results.

  47. Example: Finding a book at Amazon.com • Let’s walk through the process of finding a book at Amazon.com • Task: find a book at Amazon.com • Typical users: students who use the web regularly • Specific steps, questions and answers

  48. Pros and Cons of Cognitive Walkthroughs • It takes longer than heuristic evaluation to cover the same part of an interface, because it examines each step of a task. • You may get much more detailed information from a cognitive walkthrough. • It is useful for examining a small part of a system, whereas heuristic evaluation is useful for examining a whole system.

  49. Pluralistic Walkthroughs--Definition Another type of walkthrough in which users, developers and usability experts work together to step through a scenario, discussing usability issues associated with dialog elements involved in the scenario steps.

  50. Pluralistic Walkthroughs--Steps • Develop scenarios in the form of a series (usually 1 or 2) of hard-copy screens representing a single path through the interface. • Panelists are asked to write down the sequence of actions they would take to move from one screen to another. • Discuss the actions from that round of review (users -> experts -> designers). • Move on to the next round of screens. The process continues until all the scenarios have been evaluated.
