Designing Surveys for Mobile Devices: Pocket-Sized Surveys That Yield Powerful Results

Presentation Transcript


  1. Designing Surveys for Mobile Devices: Pocket-Sized Surveys That Yield Powerful Results Mario Callegaro, Tim Macer

  2. Mobile Phone Penetration Up

  3. Rules of Thumb • No horizontal scrolling • Vertical scrolling OK • Avoid long lists • Especially in check-all-that-apply questions • Situation Fluid • As Tablets Become Popular • Depends on the Platform

  4. Platform Considerations • Need to Test on Multiple Platforms • Apple (iPhone/iPad) does not support Flash • Phones do not enable Java by default

  5. Useful Paradata • User Agent String • Device/Model • Operating System • Screen Resolution • Fonts
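
As a rough illustration of how those paradata items might be captured (not from the presentation itself), here is a minimal Python sketch; the user-agent patterns and the screen_width/screen_height fields are hypothetical stand-ins for values a survey page would record.

```python
import re

def classify_user_agent(ua_string):
    """Very rough device/OS classification from a raw user agent string.
    A production survey platform would use a maintained parser; this only
    illustrates the kind of paradata the slide lists."""
    os_patterns = {
        "iOS": r"iPhone|iPad|iPod",
        "Android": r"Android",
        "Windows": r"Windows",
        "macOS": r"Mac OS X",
    }
    detected_os = next(
        (name for name, pattern in os_patterns.items() if re.search(pattern, ua_string)),
        "Unknown",
    )
    is_mobile = bool(re.search(r"Mobi|iPhone|iPad|Android", ua_string))
    return {"operating_system": detected_os, "mobile": is_mobile}

# Hypothetical respondent record; screen_width/screen_height would come from
# hidden form fields populated by the survey page.
respondent = {
    "user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 14_6 like Mac OS X) AppleWebKit/605.1.15",
    "screen_width": 375,
    "screen_height": 812,
}
paradata = classify_user_agent(respondent["user_agent"])
paradata["screen_resolution"] = f"{respondent['screen_width']}x{respondent['screen_height']}"
print(paradata)
```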

  6. “Can You See It Now? Good” Usability Testing of a Mobile Health Application Sarah Cook, Rita Sembajwe, Emily Geisen, Barbara Massoudi

  7. New Way to Do a Diary

  8. Benefits • Immediate Results • Cost-Effective • Creates an Easy-to-Use Dashboard

  9. Usability Suggestions • Avoid vertical scrolling on select-all questions • Make it easy to trace any sliding • Hard to video-record what respondents do

  10. Mobile Phone Effects at Event-Based Sampling Dan Williams

  11. Case Study

  12. Three Modes of Collection • Web (most popular, though not all respondents are on a mobile device) • IVR (captures the older population) • SMS (immediate response; younger respondents)

  13. Are you who you say you are? Using a Multisource Cross-validation Methodology for Panel Membership Information. Kumar Rao

  14. Real, Unique, and Engaged • 3rd Party Database Validation • Include Demographics • Use Multiple Databases
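
A minimal sketch (not from the paper) of the multisource idea: check a panelist's self-reported values against several hypothetical third-party databases keyed by panel ID, with an illustrative "each field confirmed by at least one source" rule standing in for the actual matching criteria.

```python
def cross_validate(panelist, databases, fields=("name", "zip_code", "birth_year")):
    """Count, per field, how many external databases agree with the
    panelist's self-reported value. Fields and the validation rule are
    illustrative assumptions, not the paper's actual methodology."""
    agreement = {field: 0 for field in fields}
    for db in databases:
        record = db.get(panelist["panel_id"])
        if record is None:
            continue
        for field in fields:
            if record.get(field) == panelist.get(field):
                agreement[field] += 1
    validated = all(count >= 1 for count in agreement.values())
    return validated, agreement

# Hypothetical panelist and two third-party databases keyed by panel ID.
panelist = {"panel_id": "P001", "name": "J. Smith", "zip_code": "53703", "birth_year": 1980}
db_a = {"P001": {"name": "J. Smith", "zip_code": "53703", "birth_year": 1980}}
db_b = {"P001": {"name": "J. Smith", "zip_code": "53711", "birth_year": 1980}}
print(cross_validate(panelist, [db_a, db_b]))
# -> (True, {'name': 2, 'zip_code': 1, 'birth_year': 2})
```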

  15. Results • The extra cost could be worth it • All more likely to be established households • False positives too high • Still an important part of the process

  16. Differential Sampling Based on Historical Individual-Level Data in Online Panels Richard Kelly

  17. Quota Sampling • A way to deal with non-response • Demographics weren't known • More efficient to screen out • Simply carried over to online

  18. Differential Sampling • Know the Demographics • Know the Response Rates • Oversample those Hard to Reach • More Efficient and Cost Effective
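
The arithmetic behind differential sampling can be sketched in a few lines of Python; the targets and historical response rates below are made-up numbers used only to show how harder-to-reach groups get oversampled.

```python
import math

def invites_needed(target_completes, response_rates):
    """Inflate each stratum's invitation count by the inverse of its
    historical response rate so the expected completes hit the target."""
    return {
        stratum: math.ceil(target / response_rates[stratum])
        for stratum, target in target_completes.items()
    }

# Hypothetical targets and historical response rates by age group:
# the group with the lowest response rate gets the most invitations.
targets = {"18-29": 200, "30-49": 200, "50+": 200}
rates = {"18-29": 0.08, "30-49": 0.15, "50+": 0.25}
print(invites_needed(targets, rates))
# -> {'18-29': 2500, '30-49': 1334, '50+': 800}
```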

  19. Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats on Survey Reports and Data Quality Jennifer Dykema, Nora Cate Schaeffer, Jeremy Beach, Vicki Lein, and Brendan Day

  20. Three Types of Web Designs • Check-list (more items selected) • Check-all (lower break-offs) • Stand-alone (less primacy effect)

  21. Category Selection Probing in Online Access Panels Dorothée Behr, Lars Kaczmirek, Michael Braun, Wolfgang Bandilla

  22. Cognitive Testing of Open Ends • Face-to-face too expensive • Online testing • Probing open ends • Community vs. panel • More chatty?

  23. Results • Topic Trumps Source • Use Communities Built Around the Topic • Face-to-Face More Involved

  24. Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts Vera Toepoel

  25. Snowball Recruiting • No online panel in the Netherlands is representative • Requires more commitment • Try a refer-a-friend program • Use network theory

  26. Results • Snow Never Rolled • Only got 120 recruits • Don’t Use Students • Incentives not Worth the Cost

  27. Representativeness

  28. The Use of Web Panels to Characterize Rare Conditions John Boyle

  29. Hard-to-Reach Population • Only 23 in a sample of 10,000 households • High costs • Variance too high • Important diseases

  30. Clean the Online Data • Certain Improbable Conditions • Speeders • Straightliners
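
A minimal sketch of how speeders and straightliners might be flagged; the 30%-of-median cutoff and the single-grid check are illustrative assumptions, not the study's cleaning rules.

```python
def quality_flags(grid_answers, duration_seconds, median_duration, speed_cutoff=0.3):
    """Flag speeders (completion time far below the median) and
    straightliners (identical answers across every item of a grid)."""
    flags = []
    if duration_seconds < speed_cutoff * median_duration:
        flags.append("speeder")
    if len(grid_answers) > 1 and len(set(grid_answers)) == 1:
        flags.append("straightliner")
    return flags

# Hypothetical respondent: finished in 90 seconds against a 600-second median
# and gave the same rating to every grid item.
print(quality_flags(grid_answers=[3, 3, 3, 3, 3], duration_seconds=90, median_duration=600))
# -> ['speeder', 'straightliner']
```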

  31. Results in Line • Prevalence in line • Treatment numbers good • Much larger sample size • Lower cost

  32. Measuring Intent to Participate and Participation in the 2010 Census and Their Correlates and Trends: Comparisons of RDD Telephone and Non-probability Sample Internet Survey Data Josh Pasek and Jon Krosnick

  33. Intent to Complete Census • Better demographic compositions • Intent numbers varied • Predictors of intent to complete differed • Trends also differed

  34. Can a Non-Probability Sample Ever be Useful for Representing a Population?: Comparing Probability and Non-Probability Samples of Recent College Graduates Cliff Zukin, Jessica Godofsky, Carl Van Horn, Wendy Mansfield, and J. Michael Dennis

  35. Comparing Sampling • Probability samples rest on probability theory • Without it, error cannot be intelligently traded off • Compare the KN panel to a volunteer panel • Recent graduates

  36. Results • Differences between the probability and non-probability panels • No mode effects or questionnaire effects • Differences largely mitigated when weighting on other, non-quota variables
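
As a rough illustration of weighting on a non-quota variable (not the authors' actual procedure), here is a simple post-stratification sketch in Python; the education split and sample counts are hypothetical.

```python
from collections import Counter

def poststratification_weights(sample_cells, population_shares):
    """Cell weight = population share / sample share, so that weighted
    sample proportions match the population on the chosen variable."""
    counts = Counter(sample_cells)
    n = len(sample_cells)
    return {
        cell: population_shares[cell] / (count / n)
        for cell, count in counts.items()
    }

# Hypothetical example: a volunteer panel over-represents college graduates
# relative to an assumed population split of 35% / 65%.
sample = ["college"] * 700 + ["no_college"] * 300
population = {"college": 0.35, "no_college": 0.65}
print(poststratification_weights(sample, population))
# college -> 0.5, no_college -> about 2.17
```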
