New Techniques and Technologies: Best Practices and the Future of Online Research
Agenda • Overview of Research on the Web • State of the Online Research Industry • Evolving Online Populations • Dealing with the Non-response Crisis in Research • Salience, Web Cooperation and Satisfaction • Best Practices in Online Research Design • Techniques and Methods • Historical Technological Perspective • New Techniques • Best Practices in Online Research Recruitment and Sampling
Evolution of Web-based Surveying • The research industry in the U.S. began actively working with Web-based surveying around 1995 • In the early years, online research was met with skepticism from academic, business and traditional research agencies • Key Concerns: • Web penetration was too low to be “representative” and “projectable” • Identity of online participants was difficult to “verify” • Significant minority communities of the population were not “online” • Ethnic Groups • Low Income • Low Education • Fear of changing from “traditional” methods
Benefits Now Outweighing Concerns • Over time, many of the issues causing concern have been either corrected or determined to be less important • Numerous studies have shown that Web-based surveys (when done with similar demographic groups) yield virtually the same results as traditional methods • Other benefits have superseded procedural concerns • Cost • Speed • Cooperation Rates (>45%) • Quality of Data • At the same time, the response rates to traditional requests have fallen sharply • Random telephone response rates (<12%) • Cooperation rates (<40%)
Model of U.S. Adoption of Web Research • A large number of U.S. companies project that more than 60% of their research will be switched from “traditional” to online
24% Growth in U.S. Web Research Revenue • Web-based research is expected to grow at an average of 30% per annum over the next two years to 45% of U.S. research revenue in 2005 Source: Inside Research, March 2002
Online Research Economics • Productivity enhancements drive down costs and time-to-field (chart callouts: <4 hours to field; $32/hour)
Online Research Economics • Most traditional research costs are driven by marginal, variable costs: interviewers • Online research costs are mostly fixed, up-front charges with low variable costs • For larger sample sizes, Web can be half the cost of phone (chart: at n=575, Web = ½ CATI)
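The fixed-vs-variable cost structure above implies a crossover sample size beyond which Web is cheaper than phone. A minimal sketch with hypothetical cost figures (the dollar amounts below are illustrative assumptions, not from the deck):

```python
# Illustrative cost model: phone (CATI) is mostly variable cost
# (interviewer time per complete); Web is mostly fixed, up-front cost
# with a small per-complete charge. All figures are hypothetical.

def total_cost(fixed: float, variable: float, n: int) -> float:
    """Total study cost for n completed interviews."""
    return fixed + variable * n

def crossover_n(fixed_phone: float, var_phone: float,
                fixed_web: float, var_web: float) -> float:
    """Sample size at which Web becomes cheaper than phone.

    Solves fixed_web + var_web*n = fixed_phone + var_phone*n for n.
    """
    return (fixed_web - fixed_phone) / (var_phone - var_web)

# Hypothetical: phone $2,000 setup + $30/complete; Web $8,000 setup + $3/complete
n_star = crossover_n(2000, 30, 8000, 3)
print(round(n_star))  # 222 completes; beyond this, Web is cheaper
```

With these assumed figures, a study of n=575 (the deck's example) costs roughly half as much on the Web as by phone, consistent with the slide's claim.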
Online Research Industry Profile • Three Major Trends: • Diffusion of Technology • Consolidation/Absorption • Hyper-competition • Industry profits are derived almost exclusively from marginal productivity gains
Growth in Size of Internet • By year-end 2002, 513 million people around the world will have Internet access; this number will approach 1 billion by 2005 (eTForecasts) • The top 15 countries will account for nearly 82 percent of Internet users • 79.4 Internet users per 1,000 people worldwide in 2002 • 118 per 1,000 by year-end 2005
Number of Internet Users by Region September 2002 Source: http://www.nua.ie/surveys/how_many_online/index.html
Barriers to Internet Use • Lack of Local Content: While information and content from multi-national sources may be useful, Internet users may also desire local information • Literacy: In nations where the literacy rate is low, a text-heavy Web has little use • Language (chart: over- vs. under-represented languages) Sources: 1: http://www.glreach.com/globstats/, 2: http://wcp.oclc.org/
Corporate Cultures & Web Research • Separate from simple Web penetration figures, some national business cultures cause companies to resist the adoption of the Internet as a research vehicle • The U.S. has moved beyond most of the early objections to Web adoption and is converting large proportions of traditional research to the Web (e.g., General Mills, Kraft Foods) and experimenting with rich media formats • Continental European and South American companies in particular tend to be less willing to shift research projects to the Web • Resistance is strongest in countries with a national business culture built on personal contact and face-to-face meetings (Italy, Spain, Brazil)
U.S. Online Populations Now Similar to Offline • U.S. Online Populations are beginning to match broader population statistics • Consumer Demographics Similar • Gender • Race/Origin • Age Groups (<50) • Geographic Dispersion • Household Type • Business Firmographics Similar • 86% Web penetration • Equal distribution across major SIC categories • E-mail/Web preferred method of communication by Business Decision Makers Recommended reading: http://www.ntia.doc.gov/ntiahome/dn/index.html
Web Still Has Important Differences • Psychographic Differences between online and offline respondents affect survey results • People appear to be “in a different place” mentally when responding to Web surveys • Online users’ attitudes change rapidly through experience and gain media-based savvy more quickly than other communication modes
Psychographic Differences • There are some differences between data collected online compared to interviewer-administered methods • e-Personality: • Aggressive Behavior • Stronger Opinion Positions • Intense Candor • More Participatory • Cynicism (Chart: Web vs. Telephone e-Personality Effect Pattern)
E-Personality Driven by Anonymity Gradient • Less human interaction = greater candor • Effect is intensified in Asian cultures • Effect is less pronounced in Northern Europe
Comparing “Threats to Validity”: Phone vs. Web • Internal Threats (Design and Implementation) • Extraneous Factors: No difference • Changes in Subjects: Web may be better (speed) • Measurement Changes: Web better (less interviewer bias) • Subject Guessing: No difference • Equivalent Groups: No difference • Dropout Rate Over Time: Phone may be better (email address perishability) Burns, Bush 2003, pg 137
Comparing “Threats to Validity”: Phone vs. Web • External Threats (Projectability) • Representativeness of Sample: No difference • Realism: Web may be better (visual) • Generalizability: No difference Burns, Bush 2003, pg 137
Implications of Web Drawbacks • Phone may be better if… • Answers are likely to be broad-ranging and/or unclear (need to probe) • Survey requires “persuasion” to get a full set of answers (need to coax responses) • Audience is not likely to be online (Web access is not representative of study population) • Rule of thumb: if 60% of the population is not “online,” use an alternative method or a hybrid approach • In all other areas, Web is generally as good as or better than traditional phone
Types of Online Research • Proportion of study types from top 27 Online MR Firms Inside Research, Jan. 2003
Web Candor = Data Quality • “Self-reported data” collected via the Web more closely approximates known behavioral data • Case Study: CATI consistently understated problems and overstated satisfaction; Web was much closer to known behavioral data
Web Candor = Data Quality • Most Purchase Intent and Projected Behavior data must be “deflated” for overly optimistic estimates

CATI Purchase Likelihood Deflator
Stated Purchase Likelihood     Proportion Stating   Deflator   Product
Definitely Would (5)           25%                  75%        19%
Probably Would (4)             30%                  50%        15%
Might or Might Not (3)         22%                  35%        8%
Probably Would Not (2)         10%                  25%        3%
Definitely Would Not (1)       13%                  10%        1%
Total Projected Actual Purchases: 46%

(Chart: Proportion where Stated = Actual, CATI vs. Web)
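The deflation works as a weighted sum: each stated-likelihood proportion is multiplied by its deflator, and the products are totaled. A minimal sketch using the figures from the slide:

```python
# CATI purchase-intent deflation from the slide: proportion stating
# each scale point, multiplied by its deflator, summed to project
# actual purchases.

stated = {  # scale point: (proportion stating, deflator)
    "Definitely Would (5)":     (0.25, 0.75),
    "Probably Would (4)":       (0.30, 0.50),
    "Might or Might Not (3)":   (0.22, 0.35),
    "Probably Would Not (2)":   (0.10, 0.25),
    "Definitely Would Not (1)": (0.13, 0.10),
}

projected = sum(p * d for p, d in stated.values())
# ~45% before rounding; the deck's 46% total reflects per-row rounding
print(f"Total projected actual purchases: {projected:.1%}")
```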
Rapid Evolution of Sophistication • Online population “matures” faster than offline • Offline customer tastes and demands can take years to change • Brand positioning, messaging, and effective promotional strategies can last for decades • Online customer tastes change rapidly as a by-product of experience • Levels of “satisfaction” with an online process can change in a matter of months • Large differences between B2B and B2C customer rates of evolution
Rapid Evolution of Sophistication • Model of customer evolution (results from 21 studies, 1995 through 2001)
Online Usage Behavioral Profile (Chart: proportions of Fast Adopters, Utilitarians, Abandoners)
1. Don’t look like spam! • #1 reason that people do not respond to email invitations is “I thought it was spam” • Send customized text messages, not HTML • Use the full, correct email address for each invitation. Do NOT use BCC or bulk mailing options • Familiarity with the sender is key: don't constantly change domain names from which invites are sent • Avoid domain name elements that “spam filter rules” will eliminate: • Offer, Free, Blast, Private, Bargain, Discounts, Daily, Deals, Promo, Win/Winner, Shop, Lotto, Marketing, Rewards, Wholesale, Unique, Thrifty, Value, Direct, Buy • Don’t BE spam
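A simple pre-flight check can flag trigger words before an invitation goes out. The word list comes from the slide; the matching logic below is our own illustrative sketch, not a real spam-filter implementation:

```python
# Hypothetical pre-flight check: flag subject lines or sender domains
# containing the trigger words from the slide, which common spam-filter
# rules penalize.

import re

TRIGGER_WORDS = {
    "offer", "free", "blast", "private", "bargain", "discounts", "daily",
    "deals", "promo", "win", "winner", "shop", "lotto", "marketing",
    "rewards", "wholesale", "unique", "thrifty", "value", "direct", "buy",
}

def spam_flags(text: str) -> set:
    """Return the set of trigger words found in a subject line or domain."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    return tokens & TRIGGER_WORDS

print(sorted(spam_flags("Win a FREE prize - daily deals!")))
print(sorted(spam_flags("Acme Customer Survey")))  # [] - clean subject line
```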
2. Message Line: “Sponsor” “Topic” "Survey" • Dealing with a "known and trusted" source is also a key to getting your email invitation opened • Trial-and-error has shown that a combination of "source and topic" is important • Coupling the source and topic with the specific request word: "survey" usually produces the best effect
3. First Statement • Once opened, you have only 2 to 5 seconds of reading time to get the person's decision to toss or to participate • (Compare this to 7 seconds for a spoken telephone solicitation) • About one sentence's worth of content • Our recommendation: distance research from direct sales/spam • Recommended wording: • "This invitation is to take part in research; it is not a sales solicitation."
4. Salience Points • Different factors will act most strongly to convince someone whether or not to participate. • These factors are difficult to predict because they differ by individual and by study over time • Suggest a bullet-point list of all four key salience drivers: • What are we researching? • How much are we offering for their time? • Who is the sponsor (or affiliated industry)? • How much time/effort will this take?
5. Recognition for Non-Qualified Terminates • If your sample must be screened for demographic or other characteristics, give an indication of what will happen if they don't qualify • People who terminate without recognition of their effort are less likely to participate in future research • Recommended wording: • "If you do not qualify to participate in the entire survey, you will still be entered into a drawing to win $XXX just for trying."
6. Contact for Help/More Information • Today's climate of spam and "sifting" has created respondent anxiety about "legitimacy" • Respondents do not trust "third party" researchers as much as they used to (particularly in B2B) • Need to provide a "contactable" human being live or online, who can assure potential participants that the study is authorized and legitimate • Include an email (alias) for online contact and a toll-free number for "questions you may have" about the study • Recommended wording: • "Please contact antonio.sanchez@XYZ.com, our member services manager, if you have any questions and reference project number 123-1234. Antonio also can be reached at 800-555-1234."
9. Opt-out email address and 800 # • Increasingly, state laws are requiring an opt-out mechanism for "list removal" • 14 states require a toll-free number in addition to an email reply system • $5,000-per-incident fine for not providing this information • If complaints arise, spam filter policy makers also check to see if email/telephone numbers work • Domains that do not comply will often be blocked at the ISP level for multiple systems • Recommended wording: • "If you would like to be removed from our contact list, please reply to this email and type 'Remove' in the subject line, or call 1-800-555-1234 and reference project number 123-1234."
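Honoring the email opt-out channel can be automated. The sketch below (our own illustrative logic, not from the deck) drops any address whose reply subject contains "Remove", per the recommended wording above:

```python
# Hypothetical opt-out processing: remove from the contact list any
# address that replied with "Remove" in the subject line.

def process_optouts(contact_list, replies):
    """replies: iterable of (from_address, subject) tuples.

    Returns the contact list with opted-out addresses removed
    (matching is case-insensitive).
    """
    optouts = {addr.lower() for addr, subject in replies
               if "remove" in subject.lower()}
    return [addr for addr in contact_list if addr.lower() not in optouts]

contacts = ["a@example.com", "b@example.com"]
replies = [("B@example.com", "Remove")]
print(process_optouts(contacts, replies))  # ['a@example.com']
```

Toll-free opt-out calls would feed the same removal list; the per-incident fines noted above make it worth logging every removal for audit.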
10. Industry affiliations and ethics standards • If you are a member of a national or international industry group, publish • the name (with a hyperlink) in your text-based email invitation and • the logo on your privacy page and on the first page of your Web survey • If you use a hyperlink within your live survey, be sure it opens the link in a new window • Recommended wording: • "XYZ company is a member of IMRO, the Interactive Marketing Research Organization, and we subscribe to the privacy policies and code of research ethics published by that group."
Incentivization Programs • Important to match survey types with appropriate levels of compensation • Examples: • Cash Rewards (U.S. "Most Preferred") • Drawings and Prizes • Frequent Flyer Points • Gift Certificates (Amazon.com, McDonalds, Borders, etc.) • Europe: highest response for shared information
Satisfaction by Incentive Top two box satisfaction ratings increase significantly when an incentive is offered. IMRO Survey Satisfaction Research: Respondent Satisfaction Modeling, February 2003
Satisfaction by Incentive Type Satisfaction ratings are significantly higher among those who were paid for their participation. IMRO Survey Satisfaction Research: Respondent Satisfaction Modeling, February 2003
Satisfaction by “Compensation” As mirrored in other studies, “adequate compensation” has a broad “mid level” at which satisfaction is not radically increased by increased awards. At the high and low ends of compensation, satisfaction is more significantly impacted. IMRO Survey Satisfaction Research: Respondent Satisfaction Modeling, February 2003
Variables Influencing Dropout Rates in Web-based Surveys • Design of surveys must take into account the interaction between burden and personal return variables…to enhance salience • length of survey (both in terms of time to complete and number of questions) • incentive (either total incentive offered as a prize package or the approximate value of the incentive on an individual basis) • engagement level • A combination of these factors influences the number and proportion of mid-survey abandoners (mid-terms)
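Tracking the "mid-term" proportion defined above is straightforward once starts, screen-outs, and completes are logged. A minimal sketch (field names are illustrative):

```python
# Monitoring sketch: proportion of "mid-terms" (mid-survey abandoners)
# among qualified starters. Counts would come from survey-platform logs.

def midterm_rate(started: int, completed: int, screened_out: int = 0) -> float:
    """Fraction of qualified starters who abandoned mid-survey."""
    qualified = started - screened_out
    if qualified <= 0:
        return 0.0
    return (qualified - completed) / qualified

# e.g., 1,000 starts, 150 screened out, 620 completes
print(f"{midterm_rate(1000, 620, 150):.1%}")  # prints 27.1%
```

Comparing this rate across cells that vary length, incentive, and engagement level is one way to quantify the burden/return interaction the slide describes.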
Background • Findings from 19 Web-based studies • All of the studies were with business-to-business, technology-related decision makers • Surveys included U.S., European and Asian respondents (all surveys conducted in English) • The total number of respondents included in these surveys = 21,867 • Median sample size = 473