
Mixing Modes

Presentation Transcript


1. Issues of Coverage, Sampling and Participation in Mixed Mode Surveys
Peter Lynn, University of Essex
6th ESRC Research Methods Festival, Oxford, 08-07-2014

2. Mixing Modes
“Mixing modes gives an opportunity to compensate for the weaknesses of each individual mode at affordable cost” - de Leeuw (2005)
• Involves an explicit trade-off between costs and (multiple sources of) survey errors.
• Recent interest in mixed modes is particularly stimulated by the (possibly false) notion that the marginal cost of data collection by web is close to zero.

3. A Distinction
• Multi-mode (or multiple-mode) data collection: different modes are used for different survey items, e.g.
  - a CASI component within a CAPI survey;
  - a CATI follow-up to a mail questionnaire;
  - etc.
• Mixed mode data collection: the same survey items can be collected by different modes for different sample members:
  - sequential;
  - concurrent selective;
  - concurrent with respondent choice; etc.

4. Motivation for Mixing Modes
Motivation for mixed modes rather than a single mode:
• Cost reduction; or
• Coverage/participation enhancement.
(Designs that achieve both simultaneously are proving elusive.)
Different motivations tend to suggest rather different designs:
• Combinations of modes;
• Sequence of modes (or non-sequential).
The coverage, sampling and participation issues may differ between these types of design.

5. Sequential and Concurrent Mixed Mode Designs
Sequential design:
• Use a number of modes in sequence;
• Get as much response as possible in one mode before trying the remaining non-respondents in the next mode;
• Cheaper modes first if the motivation is to reduce costs;
• Higher-response-rate modes first if the motivation is to maximise participation.
Concurrent selective design:
• Offer a different mode to each of two or more subsets of sample members.
Concurrent elective design:
• Offer each sample member a choice of mode.
Combination: e.g. a concurrent selective sequential design.

6. Sequential Design: Example
Phase 1:
• Mail an invitation to a web survey to all sample members;
• Mail a reminder to those who have not responded after a week or two.
Phase 2:
• Mail a paper self-completion questionnaire to those who have still not responded after a further week (or include this with the reminder above).
Phase 3:
• Approach for a face-to-face interview those who have still not responded after a further period.
Key design choices: Which modes? Which order? What criterion for switching to the next phase?
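
To make the phase logic concrete, here is a minimal Python sketch of a sequential design of this kind; the per-phase response propensities, sample size and function names are assumptions made for the illustration, not figures from the presentation.

```python
import random

# Assumed per-phase response propensities, purely illustrative.
PHASE_PROPENSITY = {"web": 0.20, "mail": 0.15, "face_to_face": 0.45}

def run_sequential_design(n_sample=10_000, phases=("web", "mail", "face_to_face"), seed=1):
    """Simulate a sequential mixed-mode design: each phase approaches only
    those sample members who have not yet responded in an earlier phase."""
    rng = random.Random(seed)
    outstanding = set(range(n_sample))
    responded_by_mode = {}
    for mode in phases:
        responders = {i for i in outstanding if rng.random() < PHASE_PROPENSITY[mode]}
        responded_by_mode[mode] = len(responders)
        outstanding -= responders  # only non-respondents move to the next phase
    overall_rate = 1 - len(outstanding) / n_sample
    return responded_by_mode, overall_rate

print(run_sequential_design())
```

Changing the order of `phases` is exactly the design choice the slide refers to: cheaper modes first to cut costs, or higher-response modes first to maximise participation.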

7. Coverage Issues with Mixed Mode Surveys I
Mixed modes (of approach) can help address frame quality problems. Examples:
• Inconsistent contact details on the sampling frame: only an address for some people, only an email address for others, etc.
• Dual-frame approaches: good coverage is provided only by the union of multiple frames, which have different contact details.
Note: mixed modes of approach need not necessarily imply mixed mode data collection.

8. Inconsistent Contact Details: Example
Dutch general population surveys (e.g. the Labour Force Survey, the ESS experiment):
• Select addresses from the postal address register;
• Match phone numbers to addresses (via names);
• 70% match successfully: these can be approached by phone;
• 30% do not match: these are first approached face-to-face;
• Some of the addresses with phone numbers also require face-to-face follow-up.
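
The allocation rule in this example is a simple concurrent selective one, which the hypothetical sketch below expresses directly (the record structure and the `phone` field name are invented for the illustration):

```python
def first_approach_mode(address_record):
    """Concurrent selective allocation, as in the Dutch example above:
    addresses matched to a phone number are approached by telephone first,
    the rest face-to-face."""
    return "telephone" if address_record.get("phone") else "face_to_face"

sample = [{"id": 1, "phone": "+31 20 0000000"}, {"id": 2, "phone": None}]
print([(rec["id"], first_approach_mode(rec)) for rec in sample])
```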

9. Dual-Frame: Example
• RDD or list-assisted sampling to generate a sample of phone numbers:
  - these numbers are screened to identify households (with phones).
• Supplementary sample of addresses:
  - these addresses are screened (face-to-face) to identify households with no (landline) phone.
• The two samples combined give good coverage.
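
A small Python sketch of the dual-frame logic: neither frame alone covers every household, but once the address sample is screened down to no-landline households, the union of the two frames does. The landline prevalence and all names below are illustrative assumptions.

```python
import random

# Toy population with an assumed 85% landline rate (illustrative only).
rng = random.Random(0)
population = [{"id": i, "has_landline": rng.random() < 0.85} for i in range(10_000)]

# Frame 1: RDD / list-assisted phone numbers reach only landline households.
phone_frame = [hh for hh in population if hh["has_landline"]]

# Frame 2: address sample, screened face-to-face to retain only households
# with no landline, so the two frames do not overlap.
address_frame = [hh for hh in population if not hh["has_landline"]]

covered = {hh["id"] for hh in phone_frame} | {hh["id"] for hh in address_frame}
print(f"phone frame alone:  {len(phone_frame) / len(population):.1%}")
print(f"combined coverage:  {len(covered) / len(population):.1%}")
```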

10. Coverage Issues with Mixed Mode Surveys II
Coverage issues can introduce constraints on mixed mode data collection. Example:
• The desired design is a sequential web → face-to-face design;
• But not all sample members are web users;
• So non-web users must skip the web phase, either explicitly or implicitly.

11. Coverage Issues with Web as a Primary Mode
A. Restrict the survey to web users:
• Obvious cost advantages;
• Non-random under-coverage: requires evaluation and adjustment.
Or B. Include non-web users in the web mode:
• Requires provision of hardware, software and training;
• Various models, e.g. LISS, KnowledgePanel, GIP, ELIPSS.
Or C. Include non-web users in a different mode…

12. Coverage Issues with Web as a Primary Mode, ctd.
Option C, including non-web users in a different mode:
• May have cost advantages compared to providing equipment, depending on the frequency and nature of data collection, etc.;
• May have measurement disadvantages (see the next two presentations!).
Various designs are possible:
• Web + mail, based on a mail-only approach;
• Interviewer-administered recruitment, followed by web + mail, web + phone, or other mixes;
• More options are feasible in a longitudinal context.
Examples: the GESIS Panel and the Gallup Panel are both web + mail.

13. Sampling Issues with Single Mode Web
• No general population frames with email addresses exist, so the first approach must be made by a different mode.
• For frames without names, a mail approach requires self-administered respondent selection: this is error-prone if paper-based and may cause drop-out if web-based. Interviewer administration is preferred but costly.
• Alternatively, use non-probability recruitment methods (opt-in panels) and a model-based inferential paradigm.

14. Sampling Issues with Mixed Mode including Web
• Frames with partial information (e.g. email addresses for a subset) can be used, but none exist yet in the UK; instead, a single-mode initial approach is needed.
• A mail approach, with web + mail data collection, may offer a low-cost solution of reasonable quality in some situations.
• In the UK, this design may work with named-person frames such as administrative records, but is hampered by the need for respondent selection in general population surveys.
• An interviewer approach with respondent selection may be preferable, but is only cost-effective for longitudinal surveys.

15. Respondent Selection
Next/last birthday method:
• European Social Survey (UK) experiment (Villar, 2013);
• Selection compared with birth date information from the questionnaire;
• Approx. 50% correct, 20% incorrect, 30% uncertain (amongst households with 2+ adults).
Household roster/grid method:
• Community Life Survey (Williams, 2013);
• Approx. 25% incorrect selections.
→ It is difficult or impossible to control who completes a self-completion questionnaire;
→ The chance of an incorrect respondent may be greater with web.
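
The next-birthday rule itself is easy to state and to code, as the short sketch below shows (the household members and dates are invented, and leap-day birthdays are ignored for brevity); the problem documented on this slide is that, in self-completion modes, the survey organisation cannot verify whether the household actually applied the rule.

```python
from datetime import date

def next_birthday_selection(adults, today):
    """Next-birthday respondent selection: pick the adult whose next
    birthday falls soonest after `today`. Each adult is a dict with a
    'name' and a 'birthday' given as (month, day)."""
    def days_until(birthday):
        month, day = birthday
        this_year = date(today.year, month, day)
        nxt = this_year if this_year >= today else date(today.year + 1, month, day)
        return (nxt - today).days

    return min(adults, key=lambda a: days_until(a["birthday"]))

household = [{"name": "A", "birthday": (3, 14)},
             {"name": "B", "birthday": (11, 2)}]
print(next_birthday_selection(household, today=date(2014, 7, 8))["name"])  # -> "B"
```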

16. Participation in Mixed Mode Surveys
• Response rates in single-mode surveys: face-to-face > telephone > self-completion (typically).
• Amongst self-completion surveys: mail > web (often, but not always).
• Composition of response: broadly similar between modes (typically), but with some differences.

17. Participation in Mixed Mode Surveys
[Diagram: the population, the face-to-face response and the web response]

18. Participation in Mixed Mode Surveys
[Diagram: the population and the mixed mode response?]

19. Increasing Response Rates
Success requires that:
• All (or most) people who would have responded in mode 1 continue to respond; and
• Additional people respond too (in mode 2).
But it is generally the case that:
• (Average) response propensity declines FTF → phone → mail → web;
• Data collection costs decline in the same order;
• Refusal in one mode reduces the propensity to respond in the next mode.
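
The arithmetic behind this can be sketched in a few lines; the propensities and carryover factor below are illustrative assumptions, chosen to show how a web-first sequence can end up below the rate that face-to-face alone would have achieved.

```python
def combined_rate(p1, p2_among_nonrespondents, refusal_carryover=1.0):
    """Overall response rate of a two-mode sequential design.
    p1: response rate achieved in mode 1.
    p2_among_nonrespondents: propensity to respond in mode 2 amongst
        mode-1 non-respondents, before any carryover effect.
    refusal_carryover: set below 1.0 to reflect that refusing in mode 1
        reduces the propensity to respond in mode 2."""
    return p1 + (1 - p1) * p2_among_nonrespondents * refusal_carryover

ftf_alone = 0.84  # assumed single-mode face-to-face rate
web_then_ftf = combined_rate(p1=0.30, p2_among_nonrespondents=0.70,
                             refusal_carryover=0.90)
print(f"FTF alone: {ftf_alone:.0%}; web then FTF: {web_then_ftf:.0%}")
# 0.30 + 0.70 * 0.70 * 0.90 = 0.74, i.e. below the FTF-only rate.
```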

21. Examples of Increasing Response Rates
• Examples are (all?) from surveys that use a subset of the sequence FTF – phone – mail – web;
• And even then, it is also necessary to “exhaust” each mode.
E.g. 1: British Household Panel Survey, FTF → phone:
• Approx. 93% response FTF alone; 96% FTF + phone (amongst previous-wave respondents).
E.g. 2: British Crime Survey 2011 follow-up, mail → web:
• 60% response mail alone; 63% mail + web (amongst BCS respondents).

25. Examples of Failing to Increase Response Rates
• All (?) surveys that use a sequence which is not a subset of FTF – phone – mail – web.
E.g. 1: UKHLS-IP 2009 wave 2, phone → FTF:
• 76% response FTF alone; 67% phone + FTF (wave 1 respondents).
E.g. 2: UKHLS-IP 2012 wave 5, web → FTF:
• 84% response FTF alone; 79% web + FTF (wave 4 respondents).
E.g. 3: UKHLS-IP 2012 wave 2, web → FTF:
• 85% response FTF alone; 81% web + FTF (wave 1 respondents).
E.g. 4: NL-ESS 2009, web → phone/FTF (new sample):
• 52% response FTF alone; 46% web + phone/FTF.

26. Non-Response Bias
• Most (of the few) studies to date have either found no effect or a modest (assumed) positive effect.
• The hope is that web may disproportionately add young, full-time employed, busy people, who are generally under-represented in surveys.
• But there is very little evidence either way on this point:
  - NL-ESS 2009 found almost identical sample composition in the FTF-only and web + phone/FTF samples (age, employment, education, etc.);
  - UKHLS 2012, too, found no significant differences between FTF-only and web + FTF (age, gender, household type, etc.).
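
Composition comparisons of this kind can be checked with a standard contingency-table test; the sketch below uses invented counts purely to show the calculation, not data from NL-ESS or UKHLS.

```python
from scipy.stats import chi2_contingency

age_bands = ["16-34", "35-54", "55+"]
ftf_only_counts = [310, 420, 370]    # assumed FTF-only respondents by age band
mixed_mode_counts = [335, 410, 365]  # assumed web + FTF respondents by age band

for band, a, b in zip(age_bands, ftf_only_counts, mixed_mode_counts):
    print(f"{band}: FTF-only {a}, web+FTF {b}")

chi2, p_value, dof, _expected = chi2_contingency([ftf_only_counts, mixed_mode_counts])
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
# A large p-value is consistent with "no significant differences in composition".
```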

27. Mixed Mode including Web: UK Examples
Two cross-sectional surveys:
Community Life Survey (Williams, 2013):
• Random sample of addresses from the PAF;
• Advance letter → mail invitation → mail reminder → mail questionnaire;
• 16% responded online + 11% by mail = 27% response (no incentive);
• 19% responded online + 12% by mail = 31% response (£5 conditional);
• 22% responded online + 13% by mail = 35% response (£10 conditional);
• 25% responded online + 14% by mail = 39% response (£5 unconditional).
European Social Survey (UK) experiment (Villar, 2013):
• Random sample of addresses from the PAF;
• Advance letter → mail invitation → mail reminder → face-to-face fieldwork;
• 21% responded online + 18% face-to-face = 39% response.

28. Mixed Mode including Web: Longitudinal Examples
British Crime Survey re-contact study (Fong & Williams, 2011):
• Issued sample = the 30% of BCS respondents who gave an email address;
• Email invitation → email reminder → postal questionnaire → postal reminder;
• 35% responded online + 27% by mail = 62% response.

29. Web + Interviews: Examples
1958 Birth Cohort (Brown et al., 2014):
• 9th wave (age 55), five years after the previous wave;
• Sequential web → telephone;
• 62% responded online + 21% by phone = 83% response.
Understanding Society (Jäckle et al., 2013):
• Issued samples = wave 5 and wave 2 of a household panel;
• Mail (+ email) invitation → (email reminders) → mail reminder → face-to-face;
• 21% responded online + 55% face-to-face = 76% response (wave 5 sample);
• 30% responded online + 50% face-to-face = 80% response (wave 2 sample; higher incentives);
• 23% responded online + 55% face-to-face = 78% response (wave 5 sample, £10);
• 23% responded online + 51% face-to-face = 74% response (wave 2 sample, £10).

30. Conclusions
Potential cost savings from mixed mode may erode if we:
• Aim for full population coverage;
• Aim for response at least as high as could be achieved with a single mode.
The participation advantages of mixed mode are likely to be costly.
The most promising cost-quality trade-off in the UK currently may be:
• Web → mail → face-to-face.
But… measurement concerns remain (see the next presentations!).

31. Issues of Coverage, Sampling and Participation in Mixed Mode Surveys
Peter Lynn, University of Essex
6th ESRC Research Methods Festival, Oxford, 08-07-2014
