Teach Epidemiology - PowerPoint PPT Presentation

Presentation Transcript

  1. Day 4 Teach Epidemiology Professional Development Workshop Centers for Disease Control and Prevention, Morgantown, West Virginia, June 20-24, 2011

  2. Teach Epidemiology

  3. MMWR http://www.cdc.gov/

  4. Time Check 8:15 AM

  5. Teach Epidemiology

  6. Teach Epidemiology, Day 4, Morgantown, WV. Diane Marie M. St. George, PhD, University of MD School of Medicine, Dept of Epidemiology and Public Health

  7. EU7: One possible explanation for finding an association is that the exposure causes the outcome. Because studies are complicated by factors not controlled by the observer, other explanations also must be considered, including confounding, chance, and bias.

  8. EU8: Judgments about whether an exposure causes a disease are developed by examining a body of epidemiologic evidence, as well as evidence from other scientific disciplines.

  9. EU9: While a given exposure may be necessary to cause an outcome, the presence of a single factor is seldom sufficient. Most outcomes are caused by a combination of exposures that may include genetic make-up; behaviors; social, economic, and cultural factors; and the environment.

  10. Reasons for associations Confounding Bias Reverse causality Sampling error (chance) Causation

  11. Confounding in our lives • Age-adjusted rates of… • Rates of lung cancer adjusted for smoking

  12. Osteoporosis risk is higher among women who live alone than among women who live with others.

  13. Confounding • Diagram: Number of persons in the home (exposure) and Osteoporosis (outcome), with Age as the confounder • Confounding is an alternate explanation for an observed association of interest.

  14. Confounding • Confounding is an alternate explanation for an observed association of interest. • Diagram: Exposure → Outcome, with the Confounder linked to both

  15. Confounding • YES confounding module example: • Cohort study • 9,400 elderly in the hospital • RQ: Are bedsores related to mortality among elderly patients with hip fractures?

  16. Bedsores and Mortality RR = (79 / 824) / (286 / 8576) = 2.9
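  The crude risk ratio on this slide can be reproduced in a few lines of Python (a sketch; the 2×2 counts are taken directly from the slide):

  ```python
  def risk_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
      """Risk ratio: risk of the outcome in the exposed divided by risk in the unexposed."""
      return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

  # Bedsores and mortality: 79 of 824 patients with bedsores died,
  # versus 286 of 8,576 patients without bedsores.
  rr = risk_ratio(79, 824, 286, 8576)
  print(round(rr, 1))  # 2.9
  ```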

  17. Bedsores and Mortality • Avoid bedsores…Live forever!! • Could there be some other explanation for the observed association?

  18. Bedsores and mortality • If severity of medical problems had been the reason for the association between bedsores and mortality, what might the RR be if all study participants had very severe medical problems? • What about if the participants all had problems of very low severity?

  19. Bedsores and Mortality

  20. Bedsores and Mortality (Severe) RR = (55 / 106) / (5 / 10) = 1.0

  21. Bedsores and Mortality (Not severe) RR = (24 / 718) / (281 / 8566) = 1.0

  22. Bedsores and Mortality stratified by Medical Severity

  23. Bedsores • Bedsores are unrelated to mortality among those with severe problems. • Bedsores are unrelated to mortality among those with problems of less severity. • Adjusted RR = 1, and the unadjusted RR = 2.9
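  The stratified analysis above can be sketched in Python. The stratum counts come from the Severe and Not-severe slides, and the adjusted estimate here uses the Mantel-Haenszel summary risk ratio, a common choice of adjustment method (the workshop does not specify which method produced its adjusted RR):

  ```python
  def mantel_haenszel_rr(strata):
      """Mantel-Haenszel summary risk ratio across strata.
      Each stratum is (exposed_cases, exposed_total, unexposed_cases, unexposed_total)."""
      num = den = 0.0
      for a, n1, c, n0 in strata:
          t = n1 + n0
          num += a * n0 / t
          den += c * n1 / t
      return num / den

  # Strata from the slides: (deaths with bedsores, total with bedsores,
  #                          deaths without bedsores, total without bedsores)
  severe     = (55, 106, 5, 10)
  not_severe = (24, 718, 281, 8566)

  # Stratum-specific RRs are both about 1.0 ...
  for name, (a, n1, c, n0) in [("severe", severe), ("not severe", not_severe)]:
      print(name, round((a / n1) / (c / n0), 1))

  # ... and so is the adjusted summary, versus a crude RR of 2.9.
  print("adjusted", round(mantel_haenszel_rr([severe, not_severe]), 1))
  ```

  The stratum totals (106 + 718 = 824 with bedsores; 10 + 8,566 = 8,576 without) add back up to the crude 2×2 table, which is how the crude RR of 2.9 can coexist with stratum RRs of 1.0: severity is associated with both bedsores and death.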

  24. Controlling confounding • Study design phase • Matching • Restriction • Random assignment • Study analysis phase • Stratification • Statistical adjustment

  25. Reasons for associations Confounding Bias Reverse causality Sampling error (chance) Causation

  26. Bias Case Studies • In groups, review the assigned case studies.

  27. Pesticides and cancer mortality In a study of the relationship between home pesticide use and cancer mortality, controls are asked about pesticide use and family members of cases are asked about their loved ones’ usage patterns.

  28. Birth defects and diet In a study of birth defects, mothers of children with and without infantile cataracts are asked about dietary habits during pregnancy.

  29. Types of bias • Selection bias • The process for selecting/keeping subjects causes mistakes • Information bias • The process for collecting information from the subjects causes mistakes

  30. Selection bias • People who agree to participate in a study may be different from people who do not • People who drop out of a study may be different from those who stay in the study • Hospital controls may not represent the source population for the cases

  31. Information bias • Misclassification, e.g., non-exposed classified as exposed, or cases as controls • Cases are more likely than controls to recall past exposures • Interviewers probe cases more than controls (or probe the exposed more than the unexposed)

  32. Minimize bias • Can only be done in the planning and implementation phases • Standardized processes for data collection • Masking • Clear, comprehensive case definitions • Incentives for participation/retention

  33. Reasons for associations Confounding Bias Reverse causality Sampling error (chance) Causation

  34. Reverse causality • Suspected disease actually precedes suspected cause • Pre-clinical disease → Exposure → Disease • For example: Memory deficits → Reading cessation → Alzheimer’s • Cross-sectional study • For example: Sexual activity/Marijuana

  35. Minimize effect of reverse causality • Done in the planning and implementation phase of a study • Pick study designs in which exposure is measured before disease onset • Assess disease status with as much accuracy as possible

  36. Reasons for associations Confounding Bias Reverse causality Sampling error (chance) Causation

  37. Sampling error/chance E and D are associated in a sample, but not in the population from which the sample was drawn.

  38. RR in the population

  39. RR in sample 1

  40. RR in sample 2

  41. RR in sample 3

  42. Minimize sampling error (chance) • Random selection • Adequate sample size
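  The sampling-error idea in the preceding slides can be illustrated with a small simulation (hypothetical numbers, not the ones behind the slides): exposure and disease are independent in the simulated population, so the true RR is 1.0, yet small samples can still show apparent associations by chance alone.

  ```python
  import random

  random.seed(42)  # fixed seed so the simulation is reproducible

  def sample_rr(n, p_exposed=0.5, p_disease=0.2):
      """Draw n subjects from a population where exposure and disease are
      independent (true RR = 1.0), then compute the sample risk ratio."""
      exposed_cases = exposed_total = unexposed_cases = unexposed_total = 0
      for _ in range(n):
          exposed = random.random() < p_exposed
          diseased = random.random() < p_disease  # independent of exposure
          if exposed:
              exposed_total += 1
              exposed_cases += diseased
          else:
              unexposed_total += 1
              unexposed_cases += diseased
      return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

  # Small samples scatter widely around the true RR of 1.0 ...
  print([round(sample_rr(100), 2) for _ in range(3)])
  # ... large samples land much closer to it.
  print([round(sample_rr(100_000), 2) for _ in range(3)])
  ```

  This is exactly why the slide's two remedies work: random selection keeps the sample representative on average, and a larger sample size shrinks the chance deviation from the population RR.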

  43. Time Check 9:45 AM

  44. Teach Epidemiology

  45. Time Check 10:00 AM