
Evaluating survey questions


Presentation Transcript


1. Evaluating survey questions
Survey Research and Design, Spring 2006
Class #9 (Week 10)

2. Today’s objectives
• To answer questions you have
• To understand how to build scales
• To explore techniques used to evaluate survey questions
• To spend time applying information to group projects

3. Total survey error
[Diagram: the total survey error framework (Groves et al.), with a measurement side and a representation side converging in the survey statistic.]
• Measurement side: Construct (mu_i) → Measurement (Y_i) → Response (y_i) → Edited response (y_ip), with validity, measurement error, and processing error arising between the steps
• Representation side: Target population (Y-bar) → Sampling frame (Y-bar_C) → Sample (y-bar_s) → Respondents (y-bar_r) → Postsurvey adjustments (y-bar_rw), with coverage, sampling, nonresponse, and adjustment error arising between the steps
• The two sides combine in the survey statistic (y-bar_prw)

4. Developing scales
• Some fields emphasize scales; others are fine with single items
• Why use a scale?
• All of our survey data contain error; for example, a respondent might choose the wrong response category due to distractions
• For a single item, this means error in the measure
• Suppose we combine several related items by adding them together
• Random measurement errors for a person’s responses should average out (see the simulation sketch below)
• Theoretical approach is that there is a latent construct that we can imperfectly measure with several survey items
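A minimal Python simulation of the idea above (not from the slides; the respondent count, item count, and error sizes are arbitrary assumptions): a six-item scale tracks the latent construct more closely than any single noisy item.

    import numpy as np

    rng = np.random.default_rng(0)
    n_respondents, n_items = 500, 6
    latent = rng.normal(0, 1, n_respondents)                              # true construct score
    items = latent[:, None] + rng.normal(0, 1, (n_respondents, n_items))  # each item = truth + random error

    single_item_r = np.corrcoef(latent, items[:, 0])[0, 1]
    scale_score = items.mean(axis=1)                                      # combine items into a scale
    scale_r = np.corrcoef(latent, scale_score)[0, 1]
    print(f"single item vs. truth: r = {single_item_r:.2f}")
    print(f"6-item scale vs. truth: r = {scale_r:.2f}")                   # higher: random errors average out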

5. Developing scales
• For the survey
• Figure out what you want to measure
• Develop a set of survey items that you think measure your construct
• Guidelines are similar to regular questions
• One difference: reversal in item polarity
• Determine response format
• Likert scale is generally the best choice
• Have your item pool reviewed by “experts”
• Best to pretest so you have data to analyze before the survey administration
• With data
• Examine correlation matrix; correlations should be positive
• If not, check item polarity and recode (see the sketch below)
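A rough Python sketch of the “with data” steps (item names and values are hypothetical toy data, not from the slides): inspect the correlation matrix and reverse-code a negatively worded item before building the scale.

    import pandas as pd

    # Hypothetical pretest responses: five Likert items (1-5); q4_neg is worded negatively
    df = pd.DataFrame({
        "q1":     [4, 5, 2, 4, 3, 5],
        "q2":     [4, 4, 1, 5, 3, 4],
        "q3":     [5, 4, 2, 4, 2, 5],
        "q4_neg": [2, 1, 4, 2, 3, 1],   # high score here means disagreement with the construct
        "q5":     [4, 5, 1, 4, 3, 5],
    })

    print(df.corr())                     # q4_neg should correlate negatively with the rest

    def reverse_code(series, low=1, high=5):
        """Flip a negatively worded Likert item so all items point the same way."""
        return (low + high) - series

    df["q4_rev"] = reverse_code(df["q4_neg"])   # recode before computing the scale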

6. Developing scales
• Calculate Cronbach’s alpha (a Python sketch follows this slide)
• SPSS: Analyze, Scale, Reliability Analysis; choose Statistics and check “Scale if item deleted” and “Inter-item correlations”
• SAS: Proc Corr with the Alpha option
• Drop items to increase alpha; it is not necessary to use all items
• Alpha increases as
• Inter-item correlations increase (i.e., items are similar)
• The number of items increases
• Rule of thumb is an alpha of at least .70; .90 or higher is best
• If you have several scales, it is okay to have one or two below .70
• But be aware that this indicates your scale has quite a bit of noise
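A minimal Python equivalent of the SPSS/SAS output described above (using the same hypothetical toy items as the previous sketch, with q4 already reverse-coded): Cronbach’s alpha for the whole scale, plus alpha with each item deleted.

    import pandas as pd

    def cronbach_alpha(data: pd.DataFrame) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        k = data.shape[1]
        item_vars = data.var(axis=0, ddof=1)
        total_var = data.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical recoded pretest items (toy data)
    df = pd.DataFrame({
        "q1":     [4, 5, 2, 4, 3, 5],
        "q2":     [4, 4, 1, 5, 3, 4],
        "q3":     [5, 4, 2, 4, 2, 5],
        "q4_rev": [4, 5, 2, 4, 3, 5],
        "q5":     [4, 5, 1, 4, 3, 5],
    })

    items = ["q1", "q2", "q3", "q4_rev", "q5"]
    print("alpha:", round(cronbach_alpha(df[items]), 3))

    # "Alpha if item deleted": recompute alpha leaving each item out in turn
    for item in items:
        rest = [c for c in items if c != item]
        print(f"alpha without {item}:", round(cronbach_alpha(df[rest]), 3))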

7. Generate a large pool of items
• Choose items that reflect the scale’s purpose.
• Redundancy can be okay, particularly when generating an item pool, if items express a similar idea in somewhat different ways.
• Start with a large item pool if possible.
• Look to other instruments for help.
• Begin writing questions.

8. Standardized Cronbach’s alpha: alpha = (k · r-bar) / (1 + (k − 1) · r-bar), where k = number of items in the scale and r-bar = mean inter-item correlation
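To make the formula concrete, an illustrative calculation (numbers are not from the slides): with k = 5 items and a mean inter-item correlation of .30, alpha = (5 × .30) / (1 + 4 × .30) = 1.5 / 2.2 ≈ .68. Raising k to 10 with the same mean correlation gives (10 × .30) / (1 + 9 × .30) = 3.0 / 3.7 ≈ .81, which is why alpha increases as items are added.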

9. Group projects: Begin writing questions
• What questions will you ask?
• Are there survey instruments that will inform your work?
• Do you plan to build scales?

10. Survey question standards
• Groves et al. argue that all surveys should meet three standards
• Content
• Cognitive
• Usability

11. Ways to determine if questions meet these standards
• Expert reviews
• Subject matter experts and questionnaire design experts review questions
• Sometimes they use a checklist of question problems (p. 243)
• Can help assess all three standards
• Focus groups
• A group of 6-10 volunteers participates in a discussion guided by a moderator
• Often done prior to developing a survey instrument
• Intent is to gather information (e.g., terms, common language, perspectives on key issues) about the survey topic from the target population
• Used to assess content and perhaps cognitive standards

12. Ways to determine if questions meet these standards
• Cognitive interviews
• Generally done one-on-one
• Interested in the cognitive processes of survey respondents
• Participants may think aloud as they work through the survey, or the researcher may ask questions to discover how respondents understand questions and arrive at their answers
• Some terms
• Concurrent think-alouds
• Retrospective think-alouds
• Confidence ratings
• Paraphrasing
• Definitions
• Probes
• Used to assess cognitive and usability standards

13. Ways to determine if questions meet these standards
• Field pretests
• Small-scale rehearsal of data collection
• Evaluate the instrument as well as sampling and collection procedures
• Two types of information yielded
• Interviewer debriefings (for in-person only)
• Quantitative
• Helps assess usability standards
• Randomized or split-ballot experiments
• Studies that compare different methods of collection, procedures, or versions of questions
• Randomly assign sample members to control and experimental groups (see the sketch below)
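An illustrative Python sketch of random assignment for a split-ballot experiment (IDs, group labels, and sample size are hypothetical): shuffle the sample and split it in half so each member sees one question version.

    import random

    random.seed(42)
    sample = [f"id_{i}" for i in range(1, 201)]    # hypothetical sample member IDs
    random.shuffle(sample)
    half = len(sample) // 2
    assignment = {member: ("version_A" if i < half else "version_B")
                  for i, member in enumerate(sample)}

    # Each group receives a different wording; comparing the two response
    # distributions shows whether the wording change shifts answers.
    print(sum(v == "version_A" for v in assignment.values()), "assigned to version A")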

14. What should you do?
• The discussions in the readings are “best-case” scenarios
• So you will probably not be able to do what the national surveys do
• However, you can use
• Expert reviews (who?)
• Focus groups (if necessary)
• Cognitive interviewing with several members of your population
• Don’t forget colleagues, friends, and family members
• Every additional pair of eyes is a good thing, especially for typos
• They can also catch problems even if they are not very familiar with the topic, e.g., poor instructions

15. Cognitive interviewing exercise
• Choose who will be the interviewer, and spend 10 minutes doing a cognitive interview on the graduate student survey; then switch tasks and continue along the survey.
• Be prepared to report back to the class what you find.

16. Statistical estimates of measurement quality
• Validity
• The extent to which the survey measure accurately reflects the intended construct
• The correlation between the response and the true value
• Can be estimated in two ways
• Data external to the study (see the sketch below)
• Multiple indicators of the same construct
• Compare answers with other related questions
• Check to see if it differs across groups that should be different
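One illustrative way to estimate validity from data external to the study (variable names and values are hypothetical): correlate respondents’ reported answers with matched values from records.

    import pandas as pd

    # Hypothetical reported values matched to external records (the "true" values)
    data = pd.DataFrame({
        "reported_gpa": [3.1, 2.8, 3.9, 3.4, 2.5, 3.0],
        "record_gpa":   [3.0, 3.0, 3.8, 3.2, 2.6, 3.1],
    })
    validity_r = data["reported_gpa"].corr(data["record_gpa"])
    print(f"estimated validity (correlation with records): {validity_r:.2f}")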

17. Statistical estimates of measurement quality
• Response bias
• Average difference between the response and the true value
• Can be estimated with
• Split-ballot experiments
• External individual data
• External summary data
• Reliability
• Extent to which answers are consistent or stable across measurements
• Ways to measure
• Reinterviews of respondents (see the sketch below)
• Internal data: Cronbach’s alpha
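An illustrative sketch of the two estimates named above, with hypothetical toy numbers: response bias as the mean difference between response and true value, and test-retest reliability as the correlation between original and reinterview answers.

    import pandas as pd

    # Hypothetical data: original response, external "true" value, and a reinterview answer
    data = pd.DataFrame({
        "response":    [4, 3, 5, 2, 4, 3],
        "true_value":  [4, 2, 5, 2, 3, 3],
        "reinterview": [4, 3, 4, 2, 4, 2],
    })
    bias = (data["response"] - data["true_value"]).mean()      # average response minus truth
    reliability = data["response"].corr(data["reinterview"])   # test-retest consistency
    print(f"estimated response bias: {bias:.2f}")
    print(f"test-retest reliability: {reliability:.2f}")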

18. Group projects
• What steps do you plan to take to evaluate your survey questions?
• Who? How?

19. For next week…
• Readings:
• Dillman, Chapter 3
• Christian, L.M. and Dillman, D.A. (2004). The influence of graphical and symbolic language manipulations on responses to self-administered questions. Public Opinion Quarterly, 68(1): 57-80.
• Tourangeau, R., Couper, M.P., & Conrad, F. (2004). Spacing, position, and order: Interpretive heuristics for visual features of survey questions. Public Opinion Quarterly, 68(3): 368-393.
• Reminder: No class in TWO weeks. Use the time to work on group projects.
