Item Nonresponse in a Mail Survey of Young Adults

Presentation Transcript


  1. Item Nonresponse in a Mail Survey of Young Adults Luciano Viera, Jr., Scott Turner, and Sean Marsh Fors Marsh Group LLC

  2. 2011 ITSEW – Québec, Canada Overview
  • Background
  • RDD Telephone Surveys: Coverage Implications
  • Youth Poll Mail Study
    • Purpose and Methodology
    • Mail YP Frame Coverage
    • Data Collection Efficiency Comparison
    • Profile of Responders
    • Survey Estimate Comparison
  • Item Nonresponse
  • Incentive Experiment
  • Methodology Transition

  3. 2011 ITSEW – Québec, Canada Background
  • The US Department of Defense (DoD) conducts the Youth Poll (YP) to track military propensity
    • Tracked over the past 35 years
    • Critical to maintain these trend lines
  • Random-digit-dial (RDD) survey design
    • Historically, estimates have been shown to be reliable and valid
    • For decades, RDD telephone surveying has been a cost-efficient way to survey the general public
    • RDD surveys are typically interviewer-administered
  • Emerging trends are impacting the future viability of this methodology
    • Decreasing coverage of telephone surveys
    • General decrease in survey research response rates
    • Reduced efficiency due to coverage and nonresponse issues

  4. 2011 ITSEW – Québec, Canada RDD Telephone Surveys: Coverage Implications
  • CDC study found 46% of 18- to 24-year-olds live in wireless-only households
  • Households with a landline are different from those without a landline
  • Wireless-only households are more likely to be located in urban, metropolitan areas
  [Figure: Percentage of Adults and Children with Only Wireless or No Telephone Service — series for children with wireless service only, adults with wireless service only, children with no telephone service, and adults with no telephone service. Source: Blumberg & Luke (2011)]

  5. 2011 ITSEW – Québec, Canada Purpose and Methodology
  • The Youth Poll Mail Study (YPMS) conducted a series of data collection mode comparisons
  • Goal is to evaluate the feasibility of switching the current RDD telephone–based Youth Poll to a mail-based survey methodology
  • Run concurrently with the RDD YP to compare coverage, nonresponse, and key metrics across four waves:
    • December 2008
    • June 2009
    • December 2009
    • June 2010

  6. Youth Poll Mail Study

  7. 2011 ITSEW – Québec, Canada Mail YP Frame Coverage
  • The June 2010 frame was an updated version of the December 2009 frame:
    • Aged to remove youth who were now older than 24
    • Augmented using additional lists to capture youth who had just turned 16
  • December 2009 frame coverage (see the coverage-rate sketch after this slide):
    • Comparison with Census estimates of the 16- to 24-year-old population indicated a 95% coverage rate, a 6 percentage point improvement from June 2009 (89%)*
    • Coverage of both 16-year-old (67%) and 24-year-old (95%) youth increased substantially since June 2009 (47% and 60%, respectively)
    • Coverage of the West region also improved since June 2009
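The coverage figures above come from comparing counts on the mail frame against Census population estimates for the same ages. A minimal sketch of that calculation, assuming hypothetical frame and Census counts (the slides report only the resulting rates, not the underlying counts):

```python
# Minimal sketch of a frame coverage-rate check against Census estimates.
# All counts below are illustrative placeholders, not the study's actual figures.

def coverage_rate(frame_count: int, census_count: int) -> float:
    """Share of the Census-estimated population that appears on the frame."""
    return frame_count / census_count

# Hypothetical single-year-of-age counts for one frame wave.
frame_by_age = {16: 2_900_000, 24: 4_100_000}    # youth on the mail frame
census_by_age = {16: 4_300_000, 24: 4_300_000}   # Census population estimates

for age in frame_by_age:
    print(f"Age {age}: coverage = {coverage_rate(frame_by_age[age], census_by_age[age]):.0%}")

# Overall 16-24 coverage is the same ratio computed over the summed counts.
overall = sum(frame_by_age.values()) / sum(census_by_age.values())
print(f"Overall (ages shown): {overall:.0%}")
```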

  8. 2011 ITSEW – Québec, Canada Data Collection Efficiency
  • The Mail YP required fewer contacts to reach the target population than the RDD YP, and did so at less than one-third the cost per completed interview, with no additional fielding time required
  • The Mail YP’s cost per completed survey in June 2010 ($77) was lower than in December 2009 ($99) and roughly equal to that in June 2009 ($76)
  • The increase in costs in December 2009 was largely a result of evaluating a push-to-Web solicitation strategy that was considerably less productive

  9. 2011 ITSEW – Québec, Canada Profile of Responders
  • Demographics
    • Compared with the RDD YP and Census estimates, respondents in the Mail YP were more educated
  • Telephone Status
    • Compared with the RDD YP, respondents in the Mail YP were:
      • More likely to have a cell phone only
      • More likely to be cell-mostly (i.e., they have both a landline and a cell phone but receive all or almost all of their calls on the cell phone)
      • Less likely to have a landline only

  10. 2011 ITSEW – Québec, Canada Survey Estimate Comparisons
  • Propensity estimates were statistically identical between the Mail and RDD YP (one way such a comparison could be tested is sketched after this slide)
    • True for both overall and Service-specific propensity comparisons
  • The Mail YP tends to provide slightly lower propensity estimates than the RDD YP
    • This finding is consistent with expectations
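The deck does not say which test underlies the "statistically identical" conclusion. As an illustration only, a simple two-proportion z-test comparing a Mail YP and an RDD YP propensity estimate is sketched below, with hypothetical counts and statsmodels assumed available; in practice, a design-based test that accounts for the survey weights would be more appropriate.

```python
# Illustrative mode comparison of a propensity estimate (hypothetical counts).
from statsmodels.stats.proportion import proportions_ztest

propensed = [312, 298]    # hypothetical counts of propensed youth: Mail YP, RDD YP
completes = [2400, 2300]  # hypothetical completed surveys per mode

z_stat, p_value = proportions_ztest(count=propensed, nobs=completes)
print(f"Mail YP: {propensed[0] / completes[0]:.1%}   RDD YP: {propensed[1] / completes[1]:.1%}")
print(f"z = {z_stat:.2f}, p = {p_value:.3f}  (p > .05 -> no detectable mode difference)")
```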

  11. Item Nonresponse

  12. 2011 ITSEW – Québec, Canada June 2010 Mail YP Propensity Items

  13. 2011 ITSEW – Québec, Canada Item Nonresponse
  • In the June 2010 Mail YP, the Service-specific propensity grid items yielded an overall missing data rate of 8%
  • Non-random forms of nonresponse may bias point estimates
  • Analyzed the June 2010 Mail YP data to determine:
    • Whether there was a pattern to the missing data; and
    • How refusals should be handled (e.g., multiple imputation)
  • The pattern of nonresponse was NOT random (see the sketch after this slide)
    • Non-propensed youth were more likely to refuse to answer all 12 Service-specific items
    • Conversely, propensed youth were more likely to refuse to answer 1 to 11 of the Service-specific items, probably reflecting their preference for one or more specific Services
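A minimal sketch of the kind of missingness-pattern check described above: count how many of the 12 Service-specific grid items each respondent skipped, then cross-tabulate that against overall propensity status. The data file, item names, and the propensed flag are hypothetical; the slides do not specify the actual analysis.

```python
# Sketch of a missingness-pattern check for the 12 Service-specific grid items.
# Assumes a respondent-level CSV with a 0/1 overall-propensity flag ("propensed")
# and one column per grid item, with NaN marking a skipped item (all hypothetical).
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("mail_yp_jun2010.csv")                    # hypothetical file
service_items = [f"svc_prop_{i}" for i in range(1, 13)]    # hypothetical item names

# Overall item missing-data rate (the slides report 8% for June 2010).
print(f"Overall item missing rate: {df[service_items].isna().to_numpy().mean():.1%}")

# Group respondents by how many of the 12 items they skipped.
n_missing = df[service_items].isna().sum(axis=1)

def missing_group(k: int) -> str:
    if k == 0:
        return "none missing"
    if k == 12:
        return "all 12 missing"
    return "1-11 missing"

table = pd.crosstab(df["propensed"], n_missing.map(missing_group))
chi2, p, dof, _ = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p:.4f}")  # small p => pattern is not random
```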

  14. 2011 ITSEW – Québec, Canada December 2010 Administration
  • Modified the Service-specific propensity item in the December 2010 questionnaire
  • The proportion of missing data was cut in half: from 8% in June 2010 down to 4% in December 2010

  15. Incentive Experiment

  16. AAPOR 2011 – Phoenix, AZ Combining Prepaid and Promised Incentives
  • The Mail YP periodically conducts experiments to enhance survey quality
    • Combining prepaid and promised incentives: ideally, this would maximize survey quality at a reduced cost
  • Theoretical justification
    • Trust is the centerpiece of social influence theory (i.e., enhancing trust facilitates the persuasion process)
    • Prepaid incentives may “build trust” such that they might “magnify” the positive effects of promised incentives

  17. AAPOR 2011 – Phoenix, AZ Methodology
  • June 2010 Experiment
    • A sample of 30,000 young adults ages 16-24 living in the US was randomly assigned to one of six conditions (5,000 each), crossing:
      • Survey length (long vs. short)
      • Promised incentive ($0 vs. $5 vs. $10)
    • Everyone received a $2 cash incentive in the first survey invitation package
    • Promised incentives were sent to respondents who returned questionnaires
    • Where possible, all mailing materials sent to youth were identical
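A minimal sketch of the random assignment implied by this 2 x 3 factorial design (survey length crossed with promised incentive amount, 5,000 youth per cell, with the $2 prepaid incentive going to everyone). IDs and field names are illustrative, not the study's actual implementation.

```python
# Illustrative random assignment for the 2 x 3 incentive experiment.
import random
from itertools import product

random.seed(2010)  # arbitrary seed, only for reproducibility of the sketch

conditions = list(product(["long", "short"], [0, 5, 10]))  # survey length x promised $
PER_CELL = 5_000

sample_ids = list(range(30_000))  # stand-in for the 30,000 sampled youth
random.shuffle(sample_ids)

assignments = {}
for cell, (length, promised) in enumerate(conditions):
    for youth_id in sample_ids[cell * PER_CELL:(cell + 1) * PER_CELL]:
        assignments[youth_id] = {
            "survey_length": length,
            "promised_incentive": promised,
            "prepaid_incentive": 2,  # everyone gets $2 cash in the first invitation
        }

print(f"{len(assignments):,} youth assigned across {len(conditions)} conditions")
```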

  18. AAPOR 2011 – Phoenix, AZ Results
  • Nonresponse
    • Completion rates for the short survey were higher than for the long survey
    • Promised incentives helped offset the completion rate drop for the long survey
  • Key Metric Measurement
    • In general, point estimates were similar across all the experimental conditions
  • Efficiency
    • Compared with offering just the prepaid incentive, adding a promised incentive to the long survey reduced costs by approximately 18-26% per completed survey
  Note: All estimates in the results table (not reproduced in this transcript) are unweighted; 5,000 cases were sampled in each condition.

  19. Methodology Transition

  20. 2011 ITSEW – Québec, Canada Summary
  • Several indicators support the switch to a mail-based methodology
    • The Mail YP’s existing frame coverage of the target population (95%) represented a marked improvement over existing RDD methodologies, which do not capture the steadily growing number of cell phone–only young adults
    • Metrics compared well across survey modes
      • Both overall and Service-specific propensity estimates were statistically identical between the Mail and RDD YP
    • The Mail YP required fewer contacts than the RDD YP to reach the target population
    • The Mail YP cost less than one-third as much per completed interview
    • The Mail YP required no additional fielding time
  • The Mail YP is proving to be a viable vehicle for motivating higher response rates
    • Compared with offering no contingent incentive, including a contingent incentive reduces data collection costs by approximately 18-26% per completed survey*
  *Figures are based on a comparison of the cost per completed survey in the $5 contingent incentive condition ($63,745 survey cost ÷ 749 completed surveys = $85.11 per completed survey) and the $10 contingent incentive condition ($68,980 survey cost ÷ 898 completed surveys = $76.82 per completed survey) vs. the $0 contingent incentive condition ($60,000 survey cost ÷ 575 completed surveys = $104.35 per completed survey).
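The footnote's per-survey costs and the 18-26% savings range can be reproduced directly from the reported totals; a short sketch of that arithmetic:

```python
# Reproduces the cost-per-completed-survey arithmetic from the footnote above.
conditions = {
    "$0 promised":  {"cost": 60_000, "completes": 575},
    "$5 promised":  {"cost": 63_745, "completes": 749},
    "$10 promised": {"cost": 68_980, "completes": 898},
}

baseline = conditions["$0 promised"]["cost"] / conditions["$0 promised"]["completes"]
for name, c in conditions.items():
    per_complete = c["cost"] / c["completes"]
    savings = 1 - per_complete / baseline
    print(f"{name:>12}: ${per_complete:.2f} per completed survey "
          f"({savings:.0%} below the $0 condition)")
# Output: $104.35 baseline; $85.11 (18% savings); $76.82 (26% savings)
```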
