Revising the Current Flagging Method for Transplant Program Post-Transplant Performance Reviews

Presentation Transcript

  1. Revising the Current Flagging Method for Transplant Program Post-Transplant Performance Reviews Membership & Professional Standards Committee Fall 2013

  2. The Problem • Current flagging method • Identifies too many low volume programs • Fails to identify many medium volume programs • Movement to Bayesian methodology

  3. Goal of the Proposal • Identify programs that are truly underperforming • Increase the probability that underperforming programs are flagged • Decrease the probability that average programs are flagged by mistake

  4. How the Proposal will Achieve its Goal • Use of Bayesian methodology by SRTR • New thresholds for flagging programs that perform >9 transplants in 2.5 years • P[HR>1.2]>75%, or • P[HR>2.5]>10% • Small volume (<10 transplants) – flagged on 1 event within the 2.5-year cohort
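
  To make the proposed thresholds concrete, below is a minimal sketch of the decision rule as described on this slide. It assumes posterior draws of a program's hazard ratio (HR) are already available; the SRTR's actual Bayesian model is not specified in these slides, and the function name `flag_program` and the reading of the small-volume rule as "at least one event" are illustrative assumptions, not the official implementation.

```python
import numpy as np

def flag_program(hr_posterior_draws, n_transplants, n_events):
    """Apply the proposed flagging thresholds to a single program.

    hr_posterior_draws: samples from the posterior distribution of the
    program's hazard ratio versus expected performance (how these draws
    are produced by SRTR's model is outside this sketch).
    """
    if n_transplants < 10:
        # Small-volume rule: <10 transplants, flagged on 1 event within
        # the 2.5-year cohort (interpreted here as "at least one event").
        return n_events >= 1

    # Posterior probabilities estimated from the draws.
    p_hr_gt_1_2 = np.mean(hr_posterior_draws > 1.2)
    p_hr_gt_2_5 = np.mean(hr_posterior_draws > 2.5)

    # Flag if P[HR > 1.2] > 75% or P[HR > 2.5] > 10%.
    return p_hr_gt_1_2 > 0.75 or p_hr_gt_2_5 > 0.10


# Example with made-up posterior draws for illustration only.
rng = np.random.default_rng(0)
draws = rng.lognormal(mean=0.3, sigma=0.2, size=10_000)
print(flag_program(draws, n_transplants=40, n_events=12))
```

  The sketch only shows how the two posterior-probability thresholds and the small-volume exception combine into a single flag/no-flag decision; it says nothing about how the posterior itself is fit.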

  5. Supporting Evidence • Bayesian methodology supported by • Committee of Presidents of Statistical Societies (COPSS) • Consensus Conference on Transplant Program Quality and Surveillance • SRTR Technical Advisory Committee • SRTR calculated flagging rates for 57,915 possible Bayesian flagging algorithms

  6. Supporting Evidence

  7. Supporting Evidence • Bayesian flagging algorithms should… • Have fewer false positive flags for small volume programs • Have more true positive flags for medium to larger volume programs

  8. Supporting Evidence • Historical review using January 2008 cohort data • Bayesian example used for this review • Slightly different from the proposed algorithm • Proposed algorithm more effective at increasing true positives and decreasing false positives

  9. What Members will Need to Do • Flagging methodology is a screening mechanism to identify programs that merit further inquiry • No change to data reported by programs or used by SRTR • No change to the process of review by the MPSC

  10. Questions? • Carl Berg, M.D., Committee Chair, carl.berg@duke.edu • Region # Representative, name@email • Sharon Shepherd, Committee Liaison, sharon.shepherd@unos.org • Tabitha Leighton, SRTR Representative, srtr@srtr.org

  11. Request for Inactivation/Withdrawal Language • Codifies current practice of the MPSC (transparency) • Rarely used – patient safety implications • Generally inactivation while improvements are implemented