
SURVEY DESIGN General Principles


Presentation Transcript


  1. SURVEY DESIGN: General Principles. Juan Paulo Ramírez, 8 May 2008

  2. Objective, Goal and Motivation • Objective: • Present the basic visual elements to consider when building a survey. • Goal: • Improve measurement through adequate visual design of surveys. • Motivation: • Attended presentations by Don Dillman (Washington State University) and Jolene Smyth (University of Nebraska-Lincoln).

  3. Overview • Response error vs. response rate • Tips and tricks about surveys • Conclusions

  4. We apply surveys to: • “Estimate the distribution of a characteristic for an entire population by surveying only a fraction of that population.” (Smyth, 2008a) • Note a: presentation “Using Visual Design to Improve Measurement in Web Surveys,” Survey Research and Methodology & Sociology, April 2008.
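To make the idea of estimating from a fraction concrete, here is a minimal sketch in Python (not from the presentation; the counts and the 95% confidence level are assumed purely for illustration) of how a population share and its margin of error are estimated from a simple random sample:

```python
import math

def estimate_proportion(successes: int, n: int, z: float = 1.96):
    """Point estimate and approximate 95% margin of error for a population
    proportion from a simple random sample (normal approximation)."""
    p_hat = successes / n
    margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, margin

# Illustrative numbers only: 412 of 1,000 sampled members report the characteristic.
p_hat, moe = estimate_proportion(412, 1000)
print(f"Estimated share: {p_hat:.1%} +/- {moe:.1%}")   # Estimated share: 41.2% +/- 3.1%
```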

  5. Processing a question • Responding to each survey question involves four activities: • 1. Understanding the question • 2. Retrieving relevant information from memory • 3. Deciding on an appropriate answer • 4. Reporting that answer

  6. Requirements • Those who answer the questions have to represent the entire population. • The questions have to accurately measure the construct of interest.

  7. If we fail to do that, we can encounter the following errors • Sources of error: • 1. Sampling (non-randomized sample) • 2. Non-coverage (some members of the population are not covered) • 3. Measurement (the respondent provides inaccurate information) • 4. Nonresponse: • Some members of the sample do not respond to the survey questions (a different issue from the response rate).

  8. Things to consider • A low response rate does not necessarily entail nonresponse error. • The general assumption is that the higher the response rate, the lower the potential for nonresponse error and therefore the better the survey, but the two concepts are not necessarily connected. Source: Dillman, D. (1991). The Design and Administration of Mail Surveys. Annu. Rev. Sociol., 17:225-49. This slide from page 229.
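A small back-of-the-envelope sketch of the standard nonresponse-bias decomposition helps show why the two concepts can come apart; the survey labels and numbers below are illustrative assumptions, not figures from Dillman:

```python
def nonresponse_bias(response_rate: float, mean_respondents: float,
                     mean_nonrespondents: float) -> float:
    """Bias of the respondent mean under the standard decomposition:
    (nonresponse rate) x (respondent mean - nonrespondent mean)."""
    return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

# Survey A: high response rate, but nonrespondents differ strongly from respondents.
print(nonresponse_bias(0.85, mean_respondents=0.60, mean_nonrespondents=0.20))  # ~0.06
# Survey B: lower response rate, but nonrespondents barely differ.
print(nonresponse_bias(0.55, mean_respondents=0.60, mean_nonrespondents=0.55))  # ~0.02
```

In this sketch, Survey A has the higher response rate yet the larger bias, because its nonrespondents differ far more from its respondents.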

  9. Increasing response rate • Extensive literature on this matter (e.g., Dillman’s TDM): • Follow-up mailings seem to be the most effective way to increase response rate. • Other factors studied, many of which did not make any difference: • Financial incentives • Prior notice • Special postage • Stamps on the return envelope • Personalization • Questionnaire color • Promise of anonymity • Deadline date • Nature of the cover letter • Questionnaire length

  10. Dillman’s recipe • Total Design Method (TDM) • Ordering questions (most important first) • Use of graphical design and question-writing principles (consistency in the use of large letters and contrasting small letters) • Printing the questionnaire in a booklet format with an interesting cover • Four carefully spaced mailings (postcard follow-up one week after the original mailing; a replacement questionnaire and cover letter seven weeks after the first mailing, sent by certified mail; individually printed, addressed, and signed letters; addresses printed onto envelopes rather than on address labels...)

  11. Mail Surveys vs. Interviews • Research has found that, on average, the response rate for mail surveys is only 7.5–8% lower than for face-to-face interviews. Source: Dillman, D. (1991). The Design and Administration of Mail Surveys. Annu. Rev. Sociol., 17:225-49. This slide from page 238.

  12. Increasing response rate • Effect of graphic cover Source: Dillman, D. (1991). The Design and Administration of Mail Surveys. Annu. Rev. Sociol., 17:225-49. This slide from page 235.

  13. Financial Incentives • Cash prepayment of $5: 82.6% response rate (and significantly lower item nonresponse) • Cash post payment of $10: 71.3% response rate • No financial incentive: 72.3% response rate Source: Dillman, D. (1991). The Design and Administration of Mail Surveys. Annu. Rev. Sociol., 17:225-49. This slide from page 236.

  14. Theoretical Maximum Response Asymptote • For some populations the maximum attainable response rate is 100%; for others it is 80-85%. • Typical surveys come within 10-20% of that theoretical maximum.

  15. Visual Design • Different visual layouts produce different effects on respondent answers.

  16. Pattern Recognition • Two levels: • 1) Preattentive processing: 180° visual field [Figure: top view of the eyes showing the visual field. Adapted from: faculty.washington.edu/chudler/gif/eyetrtop.gif]

  17. Pattern Recognition • 2) Attentive processing: foveal view of about 2°, within which only about 8-10 characters can be read.
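As a rough worked example of where a figure like 8-10 characters comes from (the 50 cm viewing distance and 2 mm character width are assumptions for illustration, not values from the slides):

```python
import math

# Width of a 2-degree foveal window at a given viewing distance.
viewing_distance_cm = 50.0            # assumed typical reading distance
fovea_deg = 2.0                       # foveal field of view cited on the slide
window_cm = 2 * viewing_distance_cm * math.tan(math.radians(fovea_deg / 2))

char_width_cm = 0.2                   # assumed width of one printed character
print(f"{window_cm:.2f} cm wide, about {window_cm / char_width_cm:.0f} characters")
# -> 1.75 cm wide, about 9 characters
```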

  18. Branching Instructions • 1. Which of the following best describes you? □ I tend to think before I act □ I tend to act before I think - Skip to 3 • Cognitive interviews have suggested that respondents frequently fail to see the verbal branching instruction in this location. Why? The branching instruction is the same size, shape, color and brightness as the rest of the text.

  19. What to do? • Visual search tasks have demonstrated that a target item can be located more rapidly if it is made visually dissimilar from the non-target items.

  20. Different options for branching instructions • 1. Which of the following best describes you? □ I tend to think before I act □ I tend to act before I think - Skip to 3 • Ways to make the “Skip to 3” instruction stand out: • Increase contrast ratio • Increase font size • Reverse printing • Colored background • Directional arrow (□ I tend to act before I think → Skip to 3)

  21. Branching results (“-” vs. “→” instruction), Lincoln, NE • “-” version: 90.8% branched correctly, 8.8% branching errors • “→” version: 93.8% branched correctly, 4.2% branching errors

  22. Question wording vs. Visual design • Much attention has been paid to question wording; however, the use of symbols, numbers, and graphics provides additional meaning to the respondent.

  23. Various month/year answer-space layouts and the % of respondents reporting in the desired (mm/yyyy) format • “MM YYYY” box layout (version 1): 95.8% • “MM YYYY” box layout (version 2): 87.2% • “Month Year” layout: 55.3% • “Year Month” layout: 45.4%
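To illustrate how the percentages above might be tabulated, a hypothetical checker (the regular expression and example answers are assumptions, not materials from the study) can classify whether each free-text answer is in the desired mm/yyyy format:

```python
import re

# Hypothetical helper (not part of the cited study): accept a two-digit month
# followed by a four-digit year, separated by a space, slash, or hyphen.
MM_YYYY = re.compile(r"^(0[1-9]|1[0-2])[ /-](\d{4})$")

def is_desired_format(answer: str) -> bool:
    return bool(MM_YYYY.match(answer.strip()))

print([is_desired_format(a) for a in ["05/2008", "5-08", "May 2008", "12 1999"]])
# -> [True, False, False, True]
```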

  24. Conceptual vs. Visual Midpoint • Is the federal government doing enough to ensure equal job opportunities for women? • The response scale had a clear conceptual midpoint, “About the right amount,” with “Too little” at one end and nonsubstantive options at the other. • Two versions were compared: one with a divider line setting off the nonsubstantive options and one without. • Without the divider line, responses are displaced toward the visual midpoint of the scale.

  25. Double- or triple-banking response choices • Linear version: How would you rate the overall living conditions in downtown Lincoln, Nebraska? □ Excellent □ Very Good □ Good □ Fair □ Poor • Triple-banked version: the same question with the five options arranged in multiple columns (□ Excellent □ Good □ Poor / □ Very Good □ Fair). • Results: significant changes in the share of respondents choosing Very Good and Good. • Be aware of banking constructions in web-based surveys.

  26. Forced-choice vs. check-all • On average, 66 percent of check-all respondents spent at or below the mean response time. Source: Smyth, J. D., et al. (2006). Comparing Check-All and Forced-Choice Question Formats in Web Surveys. Public Opinion Quarterly, 70(1): 66-77.

  27. Larger space for open-ended questions • Respondents typically provide shorter, less complete answers to open-ended questions than they do in interviews. • Fix: use a larger answer space; it produces a greater number of words and themes.

  28. Lesson learned • Taken together, these findings show that visual layout and design make a big difference in how survey questions are understood and answered.

  29. A Bad Ballot Design May Cost a Presidential Election • “Palm Beach County Supervisor of Elections (....) said she put the presidential candidates on two pages to keep the print size big enough for the county's many elderly voters. She has acknowledged it was a mistake.” • “Many of the voters said they had expected Gore and Bush to be the first two choices, as Florida law requires. Instead, they found Buchanan, on the opposite page, between them.” Source: http://archives.cnn.com/2001/ALLPOLITICS/03/11/palmbeach.recount/

  30. You voted for Al Gore. Did you? Source: http://fury.com/galleries/palmbeach/index.php

  31. Conclusions • Survey questions consist of much more than words. • Respondents seem to expect items to be arrayed in a logical progression from left to right or from top to bottom. • The forced-choice format is a desirable alternative to the check-all format for multiple-answer questions in Web surveys. • In open-ended questions, larger answer spaces produced a higher number of words and themes. • Progress indicators in online surveys work well for short surveys.
