
THE BEST OR BEST FEASIBLE CAUSAL METHODS FOR PCOR: CONTROVERSIES IN THE FIELD

Explore the controversies in Patient-Centered Outcomes Research (PCOR) and the best feasible causal methods for conducting PCOR studies. Discuss the importance of PCOR, the role of negative and positive doctors, and what patients really want. Learn about Comparative Effectiveness Research (CER) and the pros and cons of different study approaches. Examine the challenges in experimental and observational research and the need for replication and responsible research conduct.


Presentation Transcript


  1. THE BEST OR BEST FEASIBLE CAUSAL METHODS FOR PCOR: CONTROVERSIES IN THE FIELD Heejung Bang, PhD, UC Davis

  2. Why PCOR? • Me as a negative/null/lazy researcher-patient & co-investigator (co-I) of a PCORI trial: my personal and honest feelings about patient-centeredness (PCness) & PCOR.

  3. Episode 1: Negative doctor • I call my kid’s pediatrician the “negative doctor.” • His famous quote: “Does it work?” He tends to be negative, so when he is positive, I can follow his advice. • Examples in medical decision making: - Ex1: HPV vaccine [he said: I did not believe in it, but I believe more now.] - Ex2: Flu shot [he said: My kids take it. They miss school 3 days, instead of 5.]

  4. Episode 2: Positive doctor • I call my PCP the “positive” or “per-protocol” doctor. • I like her shared & evidence-based decision-making style. • But I don’t say hi to my docs/vet on the street…. If we are too close, it is hard to say No, or to lie, hide, or ask uncomfortable questions. e.g., It didn’t work; How much is it?; Is it covered by insurance? [I know PCORI does not like cost & CEA.] • How many patients can say No, “I don’t want to pay,” or “Hey Doc, I don’t like you”?

  5. Episode 3: PCOR – hype or hope? • PCness is important b/c - “If you die, do you think your doc will cry?” - “Your health is yours.” - I ask my doc “Do you take vitamin D too?” • Dilemma? • If you are Patient-centered, I am Patient-uncentered? • What patients really want… • Patients like Dollar store/Costco, simple questions & answers (good or bad?) and 3 bullets.

  6. What do Patients really want? • Coupon/voucher/no copay? • Do we tell the truth to doctors and researchers? • 3 independent opinions • <30 mins in clinic, or email/text/phone (again, no copay) • Mail order or vending machine for HIV meds? • Healthy dinner or total budget/N instead of CRT participation? • If you will do a bad thing anyway, do it in a better or cleaner way (e.g., syringe exchange program, big water pipe, baby box) • No follow-up? - Don’t tell me what to do (I am 77 years old).

  7. Nothing New under the Sun • Quote for PCOR? The good physician treats the disease; the great physician treats the patient who has the disease. … Osler • Medicine meets Statistics/Policy To understand God’s thoughts, we must study statistics for these are the measure of His purpose … Nightingale

  8. Comparative Effectiveness Research (CER): What & why? • My “working” definition/properties: • A vs. B vs. C or AA vs. Aa • pros & cons (e.g. fleas vs. mice; worrying vs. radiation/planB) • short & long term • when/where/whom/for what? • In all comparisons: ‘fairness’ & contexts are key. Or Doomed? • Some consider (in publication): CER = large observational data + causal inference

  9. CER approaches: RCT vs. Observation • Hardcore: RCT only (esp., large or multi-center) • Meta next? (law, sausage… & meta) • Can observations join? The use of outcomes research (divorced from random allocation..) … provides no useful means of assessing the value of a therapy…. Doll • Some fields produce more false results than others… Cook • How often do you experiment, and how often do you observe?

  10. Why cats have … Diamond (Nature 1988) • RCT? You first

  11. Double-Blinding …to prevent lawsuit?

  12. Lower rigor & more freedom vs. Higher glory; Low vs. high stakes: Which is more dangerous?* • Experiment & Observation are complementary: observation → experiment → observation • HRT & tHcy controversies: Cases not closed • pros/cons, for what; when/where; treatment not fixed/alone • Shall I take it? If so, when/how long? Revisit the evidence then? • CVD vs. Ovarian cancer/skin/hair *Personal view: RCT (& edu) may not be superior.

  13. Personal vs. General: best of both worlds • A distinguished cardiologist likes: Personalized/Precision medicine* • I like: Generalized medicine (e.g., aspirin** in the dollar store) *Multiple regression, subgroup, adaptive, lab/genes **cardiologist vs. surgeon vs. mom

  14. High-hanging, or do we live in a 2% world? (Huo, JAMA 2015) • Another issue: Most US trials showed HR=1.01 or p=0.99?

  15. Knowledge & Garbage correlated? • If we google “Crisis in Science”, statistics is a big part. • Reproducibility (Replication) & Responsible Research Conduct • undoubtedly important but aren’t you SAD? • Knowledge is Power; No knowledge is Medicine. • Surgeon/pilot/FDNY: sleep more > study? We must finally rely, as have the older sciences, on replication… Cohen

  16. Question/Goal – first & foremost • Step 0 in Research - Everyone teaches: Ask the right question… (easy?) • My favorite question & criticism: • Do you think I can answer my question with my Excel data? • You don’t know what you are doing?.... Robins • Causal effect of ‘2 glasses of wine’ a day • Compared to what? • In/excluding potential DUI? • Mortality, cancer, MI, cirrhosis, QoL, car accident? • Total, net, conditional, direct, indirect effects? • Does consuming alcohol mean reasonably healthy?

  17. Coffee & Mortality (…1960-2015…) • What kind of people exactly drink 1 cup (8oz) a day? • “Coffee+golf+PRADA” subgroup lives longest? • Association ≠ Causation = I know, I know. • You don’t need to start drinking coffee. But what if I mimic coffee-drinkers’ lifestyles? • “I've seen so many contradictory studies with coffee that I've come to ignore them all.” …. Berry • For me: I will drink coffee no matter what; Man's Search for Meaning • For my kid: Don’t start coffee. You will become a slave… addictive like mj or K-drama….

  18. One vs. Many questions • Bonferroni’s iron claws • No aphorism is more frequently repeated in connection with field trials, than that we must ask Nature few questions, or, ideally, one question, at a time. The writer is convinced that this view is wholly mistaken…. R.A. Fisher (i.e., Main & Interaction effects*) • No free lunch – to ask or answer more, we must pay… *Term “effect”=original sin? Interaction vs. effect modification
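To make the “no free lunch” point concrete, here is a minimal multiplicity sketch (assumed Python/statsmodels; the p-values are invented for illustration, not from any study in the talk):

    # Minimal multiplicity sketch: five "questions" asked of the same data.
    # The p-values below are invented for illustration only.
    from statsmodels.stats.multitest import multipletests

    raw_p = [0.001, 0.012, 0.034, 0.046, 0.210]
    for method in ("bonferroni", "holm"):
        reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method=method)
        print(method, [round(p, 3) for p in adj_p], reject.tolist())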

  19. Ideal vs. Reality • Make everything as simple as possible, but not simpler… Einstein • If we knew what it was we were doing, it would not be called research… Einstein • The predominant view in the profession is that there’s one particular way of doing economics. It’s basically to set up some mathematical model, the more complicated, the better… Chang • We need 2 methods: 1 for method papers, 1 for real use.

  20. Q1: Is a minimum wage good for the economy? Q2: Is tutoring good for students? • Maybe yes, overall. Good is good. Everybody says so. • The answer depends on when/where/whom/dose. • These are not feasible options (or are too luxurious) for some countries/students, so not relevant to everyone. • Patients’ questions are often vague or too simple; We don’t care about ACE/LATE/ATT/CDE/NDE.

  21. Causal Ultimatum A difference is a difference only if it makes a difference.

  22. Theory vs. Data • Data, Data, Data… S Holmes • Evidence without data is not evidence… study329.org • A few observations & much reasoning lead to error; many observations & a little reasoning to truth… Carrel • Retrospective rationalization… S Young • Without a trace (data analysis) • I am a firm believer that, without speculation, there is no good & original observation… C Darwin • We already know the answer… Econ joke

  23. Design/Method/Analysis - next • Design & analysis should go hand in hand (whenever possible). • Causal inference is often considered/conducted after study design or data collection is completed. • Understandably, the statistical method is often selected by software availability/familiarity/tradition/ego. • Different fields & camps promote different methods, and do not communicate well.

  24. Common deciders in method selection - At the end of the day, My method is the best. - Pressure to please reviewers/funders - med/epi vs. stat vs. econ vs. social/psych/nursing (stat vs. biostat) - PS vs. MSM vs. IV vs. SEM vs. older - logit vs. probit - try all….. smallest p • Each lab develops own code? • Fortran/C++/Python in JASA & SAS/R/STATA later?

  25. Lazy researcher-patient’s perspective • Life is short. • If I (=stat PhD) suffer, many users suffer too. • Google SAS macro – cheap & useful (expensive but QC included) • Typical users/shoppers read the title/abstract/manual only. • Long Material-Method section: Can we go to the Conclusion directly?.... my colleague • Realities: • No free lunch; causal methods entail multiple steps & checking, more assumptions, larger N.

  26. You’ve Got Mail • Nurse wrote: I have looked at several tools, and have found the one you developed to be very clear and easy to use. My study population is of lower socioeconomic and education level, and I believe the tool will work well in that population. • Some tools: Of, By, For Clinicians • A doctor’s feedback: Of course, this tool is for doctors. But when we use numbers (e.g., probability or risk), patients can understand or be convinced better.

  27. Instrumental Variable (IV) • Most beloved by Economists • In my stat training, I saw IV in Measurement error & Noncompliance. • Econ/Social see IV in regression, and Epi see it in DAGs. • Beyond the Wright brothers: The theory of IV was first derived by P. G. Wright, possibly in co-authorship with his son S. Wright, in his 1928 book The Tariff on Animal and Vegetable Oils – in “Appendix B”! vs. Path analysis was developed by S. Wright in ~1918.

  28. IV (cont’d) • When OLS fails. • Initial Qs: Do you know “a factor associated with X but not with Y”? Only through the exposure. [Interestingly, epi searches for the opposite properties!] • Devil: Where is the IV? I have 0, but you have 5? - Epidemiologist’s dream? Cautionary tale? - Strongest assumptions? Hit or miss? • Homerun example: Coin • Popular choices: cigarette price/tax, distance to hospital, practice pattern, (previous blood pressure)
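A minimal two-stage least squares sketch on simulated data (assumed Python/statsmodels; the instrument z, the effect size, and all numbers are made up to illustrate the idea, not the speaker’s own analysis):

    # Minimal 2SLS sketch: z is the instrument, associated with exposure x
    # but related to outcome y only through x. All data are simulated.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    u = rng.normal(size=n)                      # unmeasured confounder
    z = rng.normal(size=n)                      # instrument (e.g., price/tax, distance)
    x = 0.8 * z + u + rng.normal(size=n)        # exposure affected by z and u
    y = 1.5 * x + 2.0 * u + rng.normal(size=n)  # true causal effect of x is 1.5

    # Naive OLS of y on x is biased by u
    print(sm.OLS(y, sm.add_constant(x)).fit().params)

    # Stage 1: regress x on z; Stage 2: regress y on the fitted x
    x_hat = sm.OLS(x, sm.add_constant(z)).fit().fittedvalues
    iv_fit = sm.OLS(y, sm.add_constant(x_hat)).fit()
    print(iv_fit.params)   # slope should be near 1.5
    # Note: these second-stage standard errors are not valid IV standard
    # errors; a dedicated IV routine should be used for inference.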

  29. DAG & math

  30. Beyond IV; causal within non-causal? • Diff-in-Diff, Interrupted time series - Impact on policy (e.g., Obamacare) • Quasi-experiment, Natural experiment, Regression discontinuity • Aggregate/Ecological data often used • Granger-causality for time series (got a Nobel). - GNP & prices cause sunspots!
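A minimal difference-in-differences sketch (assumed Python/statsmodels, simulated data) showing the 0/1 × 0/1 interaction that carries the policy effect:

    # Minimal diff-in-diff sketch. "treated" could be a state adopting a policy;
    # "post" indicates after adoption. Data are simulated for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 4000
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),
        "post": rng.integers(0, 2, n),
    })
    # Outcome: group difference + time trend + a true policy effect of 3.0
    df["y"] = (2.0 * df.treated + 1.0 * df.post
               + 3.0 * df.treated * df.post + rng.normal(size=n))

    fit = smf.ols("y ~ treated + post + treated:post", data=df).fit()
    print(fit.params["treated:post"])   # DiD estimate, should be near 3.0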

  31. Time will tell …or dream? Hernan & Robins 2006

  32. Propensity Score (PS) • Most beloved by Epi/Med • Initial Qs: Do you want to adjust 20 Xs in regression? Who is treated? Not 50%. • Key properties: Associated with X as well as Y • Devil: No unmeasured confounder • Homeruns: Many or none; who takes statins in real life? (e.g., older, high BP & chol, MI hx) Confounders are an easier concept in med/epi. • Common practice: logistic regression (0/1) on many Xs. • Advantage: traditional methods (match/strata/reg/wgt) under one umbrella • PS is phony & IV is lousy?
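A minimal propensity score sketch (assumed Python/statsmodels, simulated data): fit a logistic regression of treatment on the Xs, then stratify on PS quintiles and average the within-stratum treated-minus-control differences:

    # Minimal PS sketch: logistic regression for treatment, then stratification
    # on PS quintiles. All data are simulated for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 5000
    age = rng.normal(60, 10, n)
    bp = rng.normal(130, 15, n)
    logit_t = -20 + 0.15 * age + 0.07 * bp           # older/higher BP more treated
    treat = rng.binomial(1, 1 / (1 + np.exp(-logit_t)))
    y = 1.0 * treat + 0.05 * age + 0.02 * bp + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([age, bp]))
    ps = sm.Logit(treat, X).fit(disp=0).predict(X)   # estimated propensity score

    df = pd.DataFrame({"y": y, "treat": treat, "ps": ps})
    df["stratum"] = pd.qcut(df.ps, 5, labels=False)
    diffs = (df[df.treat == 1].groupby("stratum").y.mean()
             - df[df.treat == 0].groupby("stratum").y.mean())
    print(diffs.mean())   # should be near the true effect of 1.0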

  33. Marginal Structural Model (MSM) • Key feature: “Marginal” effect directly. • PS’s sister?: It uses the PS via (inverse) weighting. • In short: MSM ~= IPW ~= PS-weighting? • By weighting, a population mimicking an RCT is generated; Farewell to Confounding • Weighting is popular in survey sampling & missing data: 1 person can represent >1 person. • Advantage: Time-dependent exposure/confounder, longitudinal, survival • Devil: 1/(1/1000)=1000
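A minimal inverse-probability-weighting sketch for a single time point (assumed Python; the function name is illustrative). Truncating the estimated PS is one simple way to tame the 1/(1/1000)=1000 weights; the function can be applied to the simulated y, treat, ps from the propensity score sketch above:

    # Minimal IPW sketch for a point treatment, with weight truncation to
    # tame extreme 1/PS weights. Illustrative only.
    import numpy as np

    def ipw_mean_difference(y, treat, ps, trunc=(0.01, 0.99)):
        """Weighted treated-minus-control mean difference with truncated PS."""
        ps = np.clip(ps, *trunc)                      # truncate extreme scores
        w = treat / ps + (1 - treat) / (1 - ps)       # inverse-probability weights
        treated = np.average(y[treat == 1], weights=w[treat == 1])
        control = np.average(y[treat == 0], weights=w[treat == 0])
        return treated - control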

  34. Structural Equation Modeling (SEM) • Concept/theory/framework/construct • Need right types of variables • Factor, path analysis, latent growth model, etc. • Beloved by Social/Psych/Nursing/Spatial • Lots of arrows = DAG + #s ….. (not easy for me) • Causal enough or Still observational? Too theory-driven? • Many named models e.g., Judd-Kenny, Sobel, Anderson, Donabedian, etc. • NEJM/JAMA/Lancet/FDA accept these methods? • Path analysis has changed its name to SEM... Elston

  35. DAG, SEM, IV or ancestry.com?

  36. Same Mediation, Different Players? • In Social/Psych: Baron-Kenny, Sobel-MacKinnon (e.g., Sobel is as easy as A-B-C) https://en.wikipedia.org/wiki/Sobel_test http://www.psychwiki.com/wiki/Mediation • In Epi/Med: VanderWeele/Robins (Complicated & long macro: more rigorous or better?)
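A minimal Sobel test sketch (assumed Python/statsmodels/scipy, simulated x → m → y data): estimate the a and b paths by regression, then z = ab / sqrt(b²·sa² + a²·sb²):

    # Minimal Sobel test sketch on simulated mediation data: x -> m -> y.
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    n = 1000
    x = rng.normal(size=n)
    m = 0.5 * x + rng.normal(size=n)            # mediator
    y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # outcome

    fit_a = sm.OLS(m, sm.add_constant(x)).fit()                        # a path
    fit_b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()  # b path
    a, sa = fit_a.params[1], fit_a.bse[1]
    b, sb = fit_b.params[2], fit_b.bse[2]

    sobel_z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
    p_value = 2 * (1 - norm.cdf(abs(sobel_z)))
    print(a * b, sobel_z, p_value)   # indirect effect should be near 0.5 * 0.4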

  37. We evolve or flip-flop? • Emergency Room → Emergency Department • What is Directionality?... Greenland & Morgenstern (1989) • Confounding vs. Non-collapsibility • “Comparability” died? Exchangeability in epi & Ignorability in stat • Confounding both exists & does not exist in the same study. • Translational research, Big Data, Data Science: new-born? • Why does Statistics not teach Diff-in-Diff? (0/1*0/1) • segmented ≈ broken stick ≈ //surveillance.cancer.gov/joinpoint/ • G-computation vs. G-formula vs. G-estimation vs. G-causality (G-test)
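A minimal “broken stick” (segmented) regression sketch (assumed Python/statsmodels, simulated data, with the knot location fixed in advance; real joinpoint software also searches for the knot):

    # Minimal broken-stick sketch: piecewise-linear fit with one known knot.
    # The knot is fixed at x = 5 purely for illustration, on simulated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    x = np.linspace(0, 10, 200)
    y = 1.0 * x - 1.5 * np.maximum(x - 5, 0) + rng.normal(0, 0.5, x.size)

    knot = 5.0
    X = sm.add_constant(np.column_stack([x, np.maximum(x - knot, 0)]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)   # intercept, slope before the knot, change in slope after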

  38. My humble pie… • Common practice: • 1 old + 1 fancy, or 1 fancy only • PS/MSM for cancer/NEJM vs. t-test for cosmetic surgery • 303 studies suffer the same problems + meta next? • Since causal methods rely heavily on assumptions & complex implementation, it is ideal to implement different methods relying on wholly different assumptions. • Sensitivity, consistency & generalizability/transportability (e.g., bounds, 2 datasets better than 1). • In simulation, we cook; in validation, we max.

  39. Researchers should be humble and always look or wait for experimental & biological evidence. • “Honest” reporting of inconvenient truths/limitations - No method is Perfect (or the enemy of Good). - Hope reviewers are more open-minded, with minimal penalty - Honesty is the best policy in pub & grant? - Unknown is Unknown (or Unknowable). - Difficult in competition ─ how to sort; where to cut. • More pre-specification, protocol • Permeation of statistical research with the experimental spirit… Greenwood/Hill • Validate or Perish; Publish & Run.

  40. Smoking Erases Y Chromosomes; Y Quit? (Science 2015) - Tantalizing results; mechanism unclear - Used 3 independent cohorts - No fancy method; basic stats such as box plots & the K-S test (of course, small effect sizes) - Does design really trump analysis? - New data vs. Bigger data • Causal Inference • Do we know just enough to be dangerous? Too clever by half? • Unmeasured is unmeasured. • It is a missing data problem. The t-test too?
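In the spirit of “no fancy method,” a minimal two-sample Kolmogorov–Smirnov sketch (assumed Python/scipy; the two groups are simulated and are not the Science 2015 data):

    # Minimal two-sample K-S sketch on simulated groups.
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(4)
    never_smokers = rng.normal(0.0, 1.0, 300)   # e.g., some cell-fraction measure
    smokers = rng.normal(-0.3, 1.0, 300)        # small shift = small effect size
    stat, p = ks_2samp(never_smokers, smokers)
    print(stat, p)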
