
Burden and Loss: The Role of Panel Survey Recordkeeping in Self-report Quality and Nonresponse



Presentation Transcript


  1. Burden and Loss: The Role of Panel Survey Recordkeeping in Self-report Quality and Nonresponse ITSEW 2010 Ryan Hubbard and Brad Edwards

  2. Record Usage, Perceived Respondent Burden, and Measurement • Record use may increase perceived respondent burden. Respondents experiencing higher levels of perceived burden may be more likely to drop out of the study. • The regular use of records should improve event measurement and subsequent matching to administrative data. • Record usage may disproportionately affect certain subsamples.

  3. Background

  4. Medicare Current Beneficiary Survey • Longitudinal study for the Centers for Medicare and Medicaid Services, currently in Round 58 • 16,000 Medicare beneficiaries in a rotating panel design • 12-round study, including baseline and exit interviews, conducted over 4 years • CAPI interview on health, health care utilization, and health care expenditures, designed to augment Medicare administrative data on events and payments • Matches costs to health care events in an effort to enumerate payers and payments • Relies on respondents collecting, and interviewers entering, medical insurance statements from multiple sources • Asks respondents to voluntarily track health care utilization on a provided calendar.

  5. Modeling Error in MCBS Record Usage

  6. Simplified Model [flow diagram] Record Usage (measured via the Event Calendar and the Ratio of Event Costs Covered by a Statement) increases Perceived Burden, leading to Non-response and Panel Attrition; Record Usage decreases Measurement Error (Event and Cost Data Differing from Admin Records; Non-Reporting of Events in Admin Records).

  7. Predicting Attrition in Later Waves

  8. Summary of Attrition Results Higher levels of statement usage reduce the likelihood of refusal-based attrition; this relationship is stronger than any other in the model. Greater use of the calendar, controlling for other factors, also reduces the likelihood of refusal-based attrition. Statement usage, because it is relatively passive, and calendar usage, because it is voluntary and may improve interview flow, may actually dissuade attrition through a reduction in perceived burden. Alternatively, the effects we see may represent unmeasured motivations such as topic salience and altruism: those invested in the study are more likely to comply with requests for record keeping and to complete all rounds of the study.
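
The attrition relationships summarized above can be illustrated with a toy logistic regression. Everything here is hypothetical — the data are simulated and the coefficients made up; the snippet only sketches the kind of model the slide describes, with refusal-based attrition regressed on statement and calendar usage:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Hypothetical 0-1 measures of record use for each sample person
statements = rng.uniform(0, 1, n)
calendar = rng.uniform(0, 1, n)

# Simulate refusal-based attrition so that more record use lowers the odds
true_logit = 0.5 - 2.0 * statements - 1.0 * calendar
p = 1 / (1 + np.exp(-true_logit))
refused = (rng.uniform(0, 1, n) < p).astype(float)

# Fit the logistic regression by Newton-Raphson (intercept + 2 predictors)
X = np.column_stack([np.ones(n), statements, calendar])
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-(X @ beta)))       # fitted probabilities
    w = mu * (1 - mu)                        # IRLS weights
    grad = X.T @ (refused - mu)              # score vector
    hessian = (X * w[:, None]).T @ X         # observed information
    beta += np.linalg.solve(hessian, grad)

# Negative coefficients on both usage measures, as in the simulation
print(beta)
```

In the MCBS analysis itself the model also controls for other factors; this sketch includes only the two usage predictors to keep the direction of the effects visible.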

  9. Simplified Model [flow diagram, repeated from slide 6] Record Usage (measured via the Event Calendar and the Ratio of Event Costs Covered by a Statement) affects Perceived Burden (Non-response, Panel Attrition) and Measurement Error (Event and Cost Data Differing from Admin Records; Non-Reporting of Events in Admin Records).

  10. Research Questions Does statement usage reduce event measurement error? Does statement usage reduce cost measurement error? Are there other factors related to the interview and the respondent that outweigh the effects of record usage?

  11. Analytical Approach • Analysis uses the two most recently completed panels in the study. • Explores differences in reporting for respondents who provide insurance statement records vs. those who do not. • Examines the role of other factors, such as demographics and paradata, in this process. • The final goal is to examine match rates for the two groups with regard to: • Event reporting • Cost reporting

  12. Preliminary Analyses and Findings

  13. Differences in Event Reporting: Full Sample R2 = .14

  14. Full Sample: Full Regression R2 = .24 R2 = .36

  15. Differences in Event Reporting: High Event Sample (50+ Events) R2 = .07

  16. High Event Sample: Full Regression R2 = .12 R2 = .21

  17. Modeling the Match

  18. Forthcoming Data Events reported through survey data are matched with CMS administrative data (when available). Events are combined to form a comprehensive record of medical events for each sample person. A complex algorithm drives this process and assigns a source code to each event. The source code (survey, admin, or both), event type, and event date can be matched back to the sample person to help assess measurement error.
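
The actual MCBS matching algorithm is complex and not detailed in the deck; as a minimal, hypothetical sketch of the output it produces, the snippet below combines survey-reported and administrative events (keyed here, purely for illustration, by person, event type, and date) and tags each with a source code:

```python
# Hypothetical event keys: (person_id, event_type, event_date).
# The real algorithm is far more involved; this only shows the
# "source code" idea (survey, admin, or both) on toy data.
survey_events = {
    ("p1", "inpatient", "2006-03-02"),
    ("p1", "dental", "2006-05-10"),
    ("p2", "outpatient", "2006-07-21"),
}
admin_events = {
    ("p1", "inpatient", "2006-03-02"),
    ("p2", "practitioner", "2006-08-01"),
}

def source_code(event, survey, admin):
    """Tag an event by where it was observed."""
    if event in survey and event in admin:
        return "both"
    return "survey" if event in survey else "admin"

# Comprehensive event list for all sample persons, one source code each
combined = {
    e: source_code(e, survey_events, admin_events)
    for e in survey_events | admin_events
}
```

A real match cannot rely on exact keys (dates and event types disagree across sources, as the raw match rates on slide 20 show), which is why the production algorithm is described as complex.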

  19. History of Matching on the MCBS • Goal of the match is to produce data more complete than either survey or billing records (Eppig and Chulis, 1996). • Administrative records are not the gold standard but provide good evidence: • That a health service occurred • That Medicare was a payer • The amount paid by Medicare • Survey data provide: • A good record of out-of-pocket and third-party payments • A record of events/services not covered by or submitted to Medicare

  20. Matching Details from an Earlier Panel • 2008 match data not yet available • For all medical practitioner, inpatient, and outpatient events in 2006, a raw match indicates: • 30% of events come from survey data only • 43% of events come from administrative data only • 27% of events match (can be found in both) • Inpatient events: 55% match • Medical practitioner events: 26% match • Outpatient events: 26% match • Dental visits: 0.4% match • Not a 1-to-1 correspondence
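
The 2006 raw-match shares above can be recombined with simple arithmetic; below, the reported percentages are treated as counts per 100 events (a convenience for this sketch, not how the underlying data are stored):

```python
# Reported 2006 raw-match distribution, as counts per 100 events
shares = {"survey only": 30, "admin only": 43, "both": 27}

total = sum(shares.values())                 # all events in either source
match_rate = shares["both"] / total          # overall match rate: 0.27

# Share of events (in either source) that the survey captured at all
survey_coverage = (shares["survey only"] + shares["both"]) / total
print(match_rate, survey_coverage)
```

Derived quantities like `survey_coverage` only make sense event-by-event if the match were 1-to-1, which the slide notes it is not.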

  21. Workshop Issues • Given the rich administrative data that will be available, we are interested in advice regarding the best analytical approach for assessing measurement error in: • Event reporting • Event detail reporting • Cost reporting • How do we best utilize administrative data that are a good source for matching only a subset of survey events?

  22. Ryan A. Hubbard RyanHubbard@westat.com Brad Edwards BradEdwards@westat.com

  23. Logistic Regression Results: Full Model
