
Presentation Transcript


  1. The next challenge Efficient and Effective Mixed- and Multi-mode Research Tim Macer, meaning limited, London, UK Presented at the Dutch Market Research Association Annual Conference, Rotterdam, Netherlands, 6 & 7 November 2003

  2. Agenda • The Rise of Multiple modes • The Issues • Technical framework • Survival guide

  3. 1. The Rise of Multiple Modes

  4. Time line: evolution of today’s survey modes. The original slide charts, on a timeline from 1975 to 2000, the emergence of the technology-based modes (OMR and OCR scanning, CATI, CAPI, MCAPI, disk-by-mail, CASI, TCASI/IVR, CAWI and WAP) alongside the technology-independent modes that predate them (telephone and face-to-face).

  5. The rise of multiple modes • In USA, Web surveys are the undisputed replacement for paper-based mail surveys* • Response rates falling • ‘One size fits all’ model does not work in international research • Case studies showing that mixing modes can • Achieve a better response • Remain scientifically valid *Source: RS Owen in Quirk’s magazine, Feb 2002, p.24-26

  6. What do we mean by multi-mode? (in increasing level of difficulty) • Multi-mode • Surveys utilizing more than one research channel to reach different sub-samples, but confining each sub-sample to one channel • Mixed mode • Serial • Surveys that involve successive interviewing stages, each utilizing a different mode • Parallel • Surveys that allow participants to choose the mode and even to switch modes

  7. Mixing Modes: some examples • Multi-country studies • Web in USA • CATI in EU countries • CAPI or paper in India • Let respondent choose • Contact by phone • Continue by phone or web In parallel In serial

  8. The multi-mode bandwagon. The original slide charts modes supported against product choice across 42 software packages. Source: Research Guide to Software 2003

  9. Multi-mode: the challenge “Survey organizations, whether they are in universities like mine, in private-sector organizations or in government organizations, are going to have to change dramatically in some ways in order to do effective surveys as we bring these new technologies online and still use our other technologies where they work.” Don Dillman, Washington State University

  10. 2. The issues What are the problems? How can these be resolved?

  11. The three types of modal issues • Calibration • The risk of differential measurement error due to modal effect on the respondent • Coverage • Sampling issues—risk of differential non-response from sub-samples for each mode • Complexity • Duplication of operational and programming effort in addressing more than one mode • Increased cost, delays and errors from this duplication

  12. Calibration issues • Don Dillman • Total Design Method in 1978 to achieve consistency between phone and mail surveys • Revised in 1999 to take into account Internet surveys • Examined response rate measurement differences in experimental trials • Dillman’s conclusions • There are observable and systematic differences • Disadvantages outweighed by overall improvement in sample coverage, response, time and cost Source: Dillman et al, paper at AAPOR Conference, Montreal, 2001

  13. Modal influence: calibration or coverage? • Oosterveld and Willems • Another experimental research design mixed CATI/Web surveys • Aimed to separate modal effect from population effect Source: Paper at ESOMAR Technovate, Cannes, 2003

  14. Modal influence: calibration or coverage? • Oosterveld and Willems • Another experimental research design mixed CATI/Web surveys • Research design separated modal effect from population effect • Their conclusions • The majority of differences reported in previous studies between Web and paper can be explained by population difference, not intrinsic modal effects • Mixed mode studies can be designed to have no influence on the answers Source: Paper at ESOMAR Technovate, Cannes, 2003

  15. Mode switching to improve coverage • Allison & O’Konis • Mixed Web/CATI survey of online financial services • Initial approach by CATI or Web with option to switch • 88% of CATI respondents agreed to continue their interview on the web • 54% of them went on to complete • Different modes gave highly similar responses • Their conclusions • Switching modes does increase response rate • But, provided that the switch is done immediately: tomorrow is too late Source: Quirk’s magazine, July/Aug 2002, p20

  16. Modal influences observed • Presentational influences • Ganassali and Moscarola have measured increased responses when relevant visual cues are presented in web interviews

  17. Modal influences observed • The moderating effect of the interviewer • Noted by Poynter and Comely amongst others • Can lead to under-reporting, especially of socially unacceptable responses • Chart: claimed level of mobile phone use whilst driving (rarely, sometimes, often), online vs. with interviewer After: Poynter & Comely, Beyond Online Panels, ESOMAR Technovate 2003

  18. Open-ended responses • Oosterveld and Willems • Observed longer and more detailed verbatim responses on the web than by phone • Allison and O’Konis • Observed great similarity for phone and web • However, population was one with high internet penetration • Noted some content differences, e.g. on ‘technographic’ subjects, which they attributed to population effect

  19. Scale questions • Humphrey Taylor (2000) • Observed a tendency for respondents to answer scale questions differently on the web • Dillman et al (2001) • Characterised differences between CATI and CAWI on anchored scale questions (1=strongly agree etc) • CATI respondents favour the extremes • CAWI respondents significantly more likely to use the entire scale • Bäckström and Nilsson (2003) • Observed the same tendency between self completion on paper and web • More research required

  20. Differences in ‘don’t knows’ • Hogg • More answers recorded as ‘Don’t know’ or ‘No answer’ in Web surveys than same survey when interviewer-led in CATI • Recommends omitting explicit DK/NA categories in version displayed on the Internet Source: Quirk’s magazine, July/Aug 2002, p90

  21. Population effects • Non-response (non-participation) • Don Dillman and others observed a greater tendency for males not to participate in CATI surveys and for females not to participate in Web surveys • Population effects are also influential in… • Open-ended responses • Rating scales • Possibly more (Oosterveld & Willems)

  22. Operational complexity issues • Different recruitment and screening • Can’t always approach by same mode • Duplication of the survey instrument • Complete duplication of effort may be required • Problems managing multiple versions • Data Handling • Need data in one place in one format • Problems mixing online and offline modes • Mode switching • Must be fast if response rate to be improved • Mode-appropriate texts

  23. 3. Technical framework How should technology be supporting mixed mode research? What are the software developers doing to provide this support?

  24. Framework for the ideal MM system • Common survey authoring tool across all modes • Independence of design and execution • Mode-specific texts (not through foreign languages) • One common, central database for all modes • Auto-determine contact mode from sample • Efficient mode switching • Concealment of previous data when switching to self-completion • Reminders and auto-revert to previous mode • Single view management & reporting tools across all modes • Quotas that operate across all modes • Question constructs that recognise different modes • Recording of mode at datum not case level
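The last point, recording the mode at the datum rather than the case level, is what makes later mode-by-mode analysis possible. A minimal Python sketch of the distinction follows; it is not from the presentation, and the class and field names are purely illustrative.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class CaseLevelRecord:
    """Mode stored once per respondent: a mid-interview mode switch is lost."""
    respondent_id: str
    mode: str                                   # e.g. "CATI", "CAWI", "CAPI"
    answers: Dict[str, str] = field(default_factory=dict)

@dataclass
class Datum:
    """One answer, tagged with the mode in which it was actually collected."""
    value: str
    mode: str

@dataclass
class DatumLevelRecord:
    """Mode stored per question: answers can still be analysed by mode even
    when the respondent switched modes part-way through the interview."""
    respondent_id: str
    answers: Dict[str, Datum] = field(default_factory=dict)

# A respondent recruited by CATI who switches to the web after question q2:
r = DatumLevelRecord("R001")
r.answers["q1"] = Datum("Sometimes", "CATI")
r.answers["q2"] = Datum("Agree", "CATI")
r.answers["q3"] = Datum("Strongly agree", "CAWI")   # answered after the switch
```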

  25. Suppliers contacted (supplier: product) • Askia: Askia • Mercator: snap • MI Pro: MI Pro Research Studio • Nebu: Dub Interviewer • Opinion One: CAVI • Pulse Train: Bellview Fusion • Sphinx: Sphinx • SPSS MR: Dimensions

  26. Who supports what?

  27. The issues—according to the developers

  28. Innovation: Calibration issues • Reduction of modal influence • Opinion One CAVI • Totally consistent appearance for Web, CASI & CAPI • Novel method for unaided questions in self-completion modes • Sphinx • Experimental approach • Measurement of modal differences • Pulse Train • collect paradata on mode for each question

  29. Innovation: Complexity issues • Modal independent design • SPSS MR • Modal “players” • Askia, MI Pro, Pulse Train, Nebu, SPSS MR • Modal templates applied to same survey instrument • Central database • All apart from snap • Wizards for importing offline data in Askia

  30. Innovation: Complexity issues • Mode switching • Handled well in Askia, Pulse Train, Nebu and Opinion One • Email despatched automatically in Opinion One • Nebu recognises ‘static’ and ‘dynamic’ swaps • Call me button in Pulse Train linked to dialler • Recall of interviews into CATI mode in Askia, Nebu, Pulse Train • Switching in and out of paper in MI Pro

  31. Missing features • Ability to cross-tab data by mode at a datum level • Support for systematic removal of answers from particular modes • i.e. Don’t Know and Not Stated from self-completion • Up-stream sample management • Support to simplify parallel screening • Developers need to focus more on the calibration and coverage issues!

  32. 4. Mixed mode survival guide

  33. Metadata standards can help • MR has been slow to embrace standards to allow easy data transfer from system to system • Most focus on the interchange of collected data, not survey instruments • Standards allow the metadata to be transferred along with the data • Examples of metadata include: • Question type • Unique question name • Question texts and answer texts/codes • Permitted ranges of values • Routing or filtering context
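Purely as an illustration (the structure and names below are my own, not those of any particular standard), these metadata items could be carried between systems as a simple structure and serialized to a neutral format for exchange. A Python sketch:

```python
import json
from dataclasses import dataclass, asdict, field
from typing import List, Optional, Tuple

@dataclass
class AnswerCode:
    code: int
    text: str

@dataclass
class QuestionMetadata:
    name: str                       # unique question name
    qtype: str                      # "single", "multiple", "quantity", "text", ...
    text: str                       # question text shown to the respondent
    answers: List[AnswerCode] = field(default_factory=list)
    value_range: Optional[Tuple[int, int]] = None   # permitted range for quantities
    ask_if: Optional[str] = None    # routing/filter condition, as an expression

q = QuestionMetadata(
    name="MOBILE_DRIVE",
    qtype="single",
    text="How often do you use a mobile phone whilst driving?",
    answers=[AnswerCode(1, "Often"), AnswerCode(2, "Sometimes"), AnswerCode(3, "Rarely")],
    ask_if="DRIVES == 1",
)

# Export in a neutral, system-independent form so another package can import it.
print(json.dumps(asdict(q), indent=2))
```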

  34. Triple-s www.triple-s.org • First published 1994 • Originated in the UK but now implemented by 30 vendors worldwide • Exchange data and metadata via exports and imports in a generalized format • Version 1.1 introduced XML support • New version 1.2 adds filters, weighting and multi-language support • No metadata support for survey filtering or routing logic
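On the import side, XML metadata of this kind is straightforward to read with standard tools. The sketch below uses Python's xml.etree.ElementTree on an invented, Triple-s-like layout; the element names are assumptions for illustration only, not the actual Triple-s schema (see www.triple-s.org for that).

```python
import xml.etree.ElementTree as ET

# Illustrative metadata fragment only -- NOT the real Triple-s schema.
SAMPLE = """
<survey>
  <variable ident="1" type="single">
    <name>MOBILE_DRIVE</name>
    <label>Use of mobile phone whilst driving</label>
    <values>
      <value code="1">Often</value>
      <value code="2">Sometimes</value>
      <value code="3">Rarely</value>
    </values>
  </variable>
</survey>
"""

root = ET.fromstring(SAMPLE)
for var in root.findall("variable"):
    name = var.findtext("name")
    label = var.findtext("label")
    codes = {v.get("code"): v.text for v in var.findall("values/value")}
    print(name, label, codes)
```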

  35. SPSS Dimensions Data Model • A new open (though proprietary) metadata model for survey data • Can be licensed independently of all SPSS MR products (don’t have to use SPSS software) • Comes with a developers’ library of tools for building applications that will read or write data via the SPSS Data Model • Many other software companies now providing support for the SPSS Data Model • Metadata for survey data not survey routing and logic

  36. QEDML www.philology.com.au • New multi-platform survey authoring tool • Exports scripting languages for several packages, including Quancept, Surveycraft and In2form • XML based open system, allows other language translators to be added

  37. Tips for multi-mode survey design • Design your survey to be as mode neutral as possible • Pay attention to rating scales • Consider exclusion of Don’t know/Not stated answers on self-completion modes • Ensure you can identify the mode when analysing your data, at each question • Standardise on the software, or at least, the data format
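To illustrate the last two tips, the following Python sketch (with made-up data and field names) shows how datum-level mode tagging lets any question be tabulated by mode at analysis time:

```python
from collections import Counter, defaultdict

# (respondent_id, question, answer, mode) -- datum-level records
data = [
    ("R001", "q5", "Strongly agree", "CAWI"),
    ("R002", "q5", "Agree",          "CATI"),
    ("R003", "q5", "Strongly agree", "CAWI"),
    ("R004", "q5", "Agree",          "CATI"),
    ("R005", "q5", "Neither",        "CAWI"),
]

# Count the answers to q5 separately for each collection mode.
by_mode = defaultdict(Counter)
for _, question, answer, mode in data:
    if question == "q5":
        by_mode[mode][answer] += 1

for mode, counts in sorted(by_mode.items()):
    print(mode, dict(counts))
```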

  38. In summary • Modal differences do exist, but can be overcome with careful design • Issues relate to: Calibration, Coverage and Complexity • Common survey authoring and a common results database improve MM efficiency • Software manufacturers are largely focusing on resolving complexity issues • Better standards, especially for survey instrument metadata, are needed

  39. Bibliography
Allison J & O’Konis C (2002) If Given the Choice, Quirk’s Marketing Research Review, July/August issue, p 20.
Bäckström C & Nilsson C (2002) Mixed mode: Handling method differences between paper and web questionnaires, http://gathering.itm.mh.se/modsurvey/pdf/MixedMode-MethodDiff.pdf
Dillman D A (1978) Mail and Telephone Surveys: The Total Design Method, Wiley.
Dillman D A, Phelps G, Tortora R, Swift K, Kohrell J & Berck J (2001) Response Rate Measurement Differences in Mixed Mode Surveys Using Mail, Telephone, Interactive Voice Response and the Internet, AAPOR Annual Conference, Montreal.
Ganassali S & Moscarola J (2002) Protocoles d’enquête et efficacité des sondages par Internet [Survey protocols and the effectiveness of Internet surveys], Journées E-Marketing AFM/AIM Conference, Nantes, France.
Macer T (2003) Research Software Review, The Market Research Society, London.
Oosterveld P & Willems P (2003) Two Modalities, One Answer, ESOMAR Technovate Conference, Cannes.
Owen R S (2002) A Matter of Trade-offs: Examining the advantages and disadvantages of online surveys, Quirk’s Marketing Research Review, February, pp 24-26.
Poynter R & Comely P (2003) Beyond Online Panels, ESOMAR Technovate Conference, Cannes.
Taylor H (2000) Does Internet Research Work? Comparing online survey results with telephone survey, International Journal of the Market Research Society, 42.1.

  40. www.meaning.uk.com
