
Building Quality Improvement Partnerships in the VA: The Colorectal Cancer Care Collaborative

Building Quality Improvement Partnerships in the VA: The Colorectal Cancer Care Collaborative. QUERI National Meeting Phoenix, AZ December 2008 George L. Jackson, Ph.D., MHA; Leah L. Zullig, MPH Adam A. Powell, Ph.D., MBA; Diana L. Ordin, MD, MPH Dawn T. Provenzale, MD, MS.


Presentation Transcript


  1. Building Quality Improvement Partnerships in the VA: The Colorectal Cancer Care Collaborative. QUERI National Meeting, Phoenix, AZ, December 2008. George L. Jackson, Ph.D., MHA; Leah L. Zullig, MPH; Adam A. Powell, Ph.D., MBA; Diana L. Ordin, MD, MPH; Dawn T. Provenzale, MD, MS

  2. C4: Colorectal Cancer Care Collaborative • Began in 2005 • Goal: to assess and improve the quality of colorectal cancer care from screening and diagnosis through treatment • Phase I: Diagnosis (from screening or presentation with symptoms through diagnosis) • Phase II: Treatment (the period from diagnosis of CRC through treatment & follow-up)

  3. Time Line • Summer 2005: Collaborative planning • Sept. 2005-Sept. 2006: Collaborative Phase 1 (screening result to diagnosis) • Oct. 2006-Present: Spread of lessons from Phase 1 • March 2007-March 2008: Collaborative Phase 2 (diagnosis to treatment)

  4. Today’s Workshop • How did we get started on the collaboration? • Overview of Colorectal Cancer Care Collaborative • Measurement challenges • Building a measurement system • Spreading lessons to the VA • Lessons for QUERI investigators

  5. Why C4? Initiated in 2005: • Earlier CMO study suggested timeliness problems • QUERI research results demonstrated gaps in colorectal cancer diagnosis and treatment • OIG report • Congressionally-mandated review of cancer care (GPRA – Government Performance and Results Act) • Colorectal, breast, lung, prostate, hematologic

  6. Colorectal Cancer • Second leading cause of cancer death • Third most common type of cancer among men and women in the United States • 11% of all new cancer cases • 90% five-year survival when diagnosed at stage I • 5% five-year survival when diagnosed at stage IV • Source: VA Colorectal Cancer QUERI Fact Sheet, January 2006

  7. CRC Continuum • Signs & symptoms or initial screen → if positive, complete diagnostic evaluation (CDE); if negative, repeat screening • CDE findings: Adenomas → surveillance; Cancer → treatment (surgery, chemotherapy, radiation) • CDE = complete diagnostic evaluation

  8. Follow-Up of Positive FOBT (chart) • % of patients with a positive fecal occult blood test scheduled for diagnostic evaluation: 63%, 48%, and 95% across three facilities (A, B, C) • % of those scheduled completing the diagnostic evaluation within 1 year: 45%, 23%, and 64% • Chart also showed mean scheduling delay, mean appointment wait, and mean completion delay

  9. Modifiable Risk Factors for Advanced CRC • 549 patients • 43% presented with late stage (stage III or IV) colorectal cancer • The only factor associated with presenting with late stage was not having a usual source of health care • Median patient delay – 9 weeks • Median physician delay – 6 weeks • Stage at presentation was not associated with either patient or physician delay Fisher DA, Martin C, Galanko J, Sandler RS, Noble MD, Provenzale D. Risk factors for advanced disease in colorectal cancer. Am J Gastroenterol 2004;99:2019-2024.


  11. OIG Report: CRC Detection and Management in VHA Facilities Feb. 2006 • Metrics to evaluate and improve CRC dx timeliness • Prioritization process for dx c-scopes • Directive addressing timeframes • Pt notification of screening results within 7 working days • Consistent notification and documentation requirement for dx testing

  12. OQP Vision • Measures and measurement tool development (QUERI/HSR&D) • Pilot collaborative project to identify and develop improvement strategies/tools (OQP/SR) • National dissemination of project (SR/OQP) • Monitors or Performance Measures to create “pull” for improvement • Ongoing support to facilitate sharing, identification of additional effective strategies/tools

  13. OQP Vision • Partnership among OQP, researchers, PCS, Advanced Clinical Access/Systems Redesign • Strong, ongoing evaluation component

  14. Anticipated Challenges • Measurement challenges • Improvement challenges • Dissemination challenges • Two phases: diagnosis and treatment • Project infrastructure • New partnership model • “Just-in-time” planning • Pace and design of project • Sense of urgency • Cultural “clashes” • Research vs. operations • Anecdote vs. evidence

  15. Anticipated Outcomes and Products • Measurement • Standardized facility-level approaches for QI measures • Real-time measurement tools • Documentation of barriers to national measurement • Improvement tools/strategies • Dissemination mechanism • Improvement before external review published • Lessons on how to do this better next time • Project organization and partner roles • C4-type collaborative

  16. The Partnership • Quality Enhancement Research Initiative (QUERI) • CRC expertise in measurement and improvement • Office of Quality and Performance (OQP) • Performance measurement expertise • Quality improvement expertise • Systems Redesign • Expertise in delay reduction • National infrastructure, experience, and tools • Patient Care Services • Clinical expertise • Link to VA clinical constituencies

  17. C4 Planning Committee • Organizes the collaborative • Includes representatives from all partner organizations and other VA collaborative experts • Subcommittees • Measurement Issues • Collaborative Operations • Dissemination

  18. Optimizing the Partnership • Dialogue is critical! • Initial QUERI-provided measures were critiqued by the field • C4 works with the field to develop better measures • Some may inform national data systems and some may remain local improvement tools • OQP, DUSHOM and VISN CMOs provide continued support

  19. Changing Systems

  20. C4 Learning Collaboratives • 21 volunteer facilities (one per VISN) in the diagnosis collaborative • 28 volunteer facilities (at least one per VISN) in the treatment collaborative • Collaborative: structured sharing with rapid-cycle improvement • Planning and facilitation by partner organizations with the involvement of many VA stakeholders

  21. Diagnosis Collaborative: 21 Improvement Teams • VISN 1 Providence • VISN 2 Buffalo • VISN 3 New Jersey • VISN 4 Pittsburgh • VISN 5 Washington • VISN 6 Beckley, WV • VISN 7 Columbia, SC • VISN 8 San Juan • VISN 9 Lexington, KY • VISN 10 Columbus • VISN 11 Northern Indiana • VISN 12 Chicago (Hines) • VISN 15 St. Louis • VISN 16 Houston • VISN 17 Temple • VISN 18 West Texas • VISN 19 Salt Lake City • VISN 20 Portland • VISN 21 San Francisco • VISN 22 Loma Linda • VISN 23 Black Hills, SD

  22. Treatment Collaborative: 28 Improvement Teams • VISN 1 Providence and VA Connecticut • VISN 2 Buffalo • VISN 3 New Jersey • VISN 4 Pittsburgh and Lebanon, PA • VISN 5 Washington • VISN 6 Beckley, WV and Salisbury, NC • VISN 7 Columbia, SC • VISN 8 Gainesville • VISN 9 Lexington, KY • VISN 10 Dayton • VISN 11 Northern Indiana • VISN 12 Chicago (Hines) • VISN 15 St. Louis • VISN 16 Houston • VISN 17 Temple • VISN 18 West Texas and Albuquerque • VISN 19 Salt Lake City • VISN 20 Portland and Puget Sound • VISN 21 San Francisco • VISN 22 Loma Linda and San Diego • VISN 23 Black Hills, SD and Nebraska/W. Iowa

  23. C4 Learning Collaborative Process • Flow-mapping and initial data collection • QUERI measurement using CPRS data • Local measurement • Setting aims • Plan-Do-Study-Act (PDSA) cycles • Coaches aid in the improvement process • Collaborative sharing via in-person meetings, monthly national calls, monthly reports to coaches and senior leaders, updates to VA leadership, website, and listserv

  24. Collaborative Process • Team selection and commitment → In-Person Meeting (flow mapping, baseline measures, aim setting) → Plan-Do-Study-Act cycles (changes and measurement) → Dissemination • Ongoing throughout: reports to C4 and leadership; structured sharing (e.g., national calls)

  25. C4 Team Composition • Facility Management • Facilities volunteered for the collaborative • Applications signed by the medical center director, chief of staff, and nursing executive • Sites chosen to provide diversity in size, complexity, and geography • Team Formation • Teams include physicians, nurses, and other representatives from the involved clinical services • Designated project manager • Information technology representative

  26. C4 Team Activities • Flow-mapping and initial data collection • Setting aims • Plan-Do-Study-Act (PDSA) cycles • Coaches aid in the improvement process • Collaborative sharing via in-person meetings, monthly national calls, monthly reports to coaches and senior leaders, updates to VA leadership, website, and listserv

  27. Model for Improvement: PDSA Rapid Cycle Improvement (Plan → Do → Study → Act) • What are we trying to accomplish? • How will we know that a change is an improvement? • What changes can we make that will result in an improvement? • PDSA slides courtesy of Jim Schlosser, MD, MBA

  28. The PDSA Cycle for Learning and Improvement • Plan: objective; questions and predictions (why); plan to carry out the cycle (who, what, where, when) • Do: carry out the plan; document problems and unexpected observations; begin analysis of the data • Study: complete the analysis of the data; compare data to predictions; summarize what was learned • Act: what changes are to be made? next cycle?

  29. Examples of PDSA Cycles • Change idea: reduction of appointment types will increase appointment availability • Cycle 1: Define a small number of appointment types and test with staff • Cycle 2: Compare requests for the types for one week • Cycle 3: Test the types with 1-3 providers’ patients • Cycle 4: Standardize appointment types and test their use • Cycle 5: Implement standards and monitor their use • Each cycle builds on data from the last, moving toward improved access

  30. Testing Multiple Changes (parallel PDSA ramps) • Overall Aim: improve timeliness, reliability, and patient focus of CRC treatment • Ramps: improve pathology reporting; shorten staging work-up; improve treatment concordance; improve patient education

  31. Systems Redesign/Advanced Clinic Access • Scientifically based principles • System/process redesign • Everything improves • Requires measurement • Delay elimination http://srd.vssc.med.va.gov

  32. High Leverage Changes to Eliminate Delay • Access: match supply & demand daily; reduce the backlog; decrease appointment types; develop contingency plans; reduce demand; increase supply / optimize the team • Office Efficiency: balance supply & demand for non-appointment work; synchronize patient, provider, & information; predict & anticipate patient needs; optimize rooms & equipment; manage constraints

  33. Service Agreements • Purpose: specialists can’t do everything best; primary care can’t do everything best; best utilization of resources • Elements: 1. Define the work. It is not “NO” work; it is not “ALL” work; it is the work that only I can do (colonoscopy). 2. The sender agrees to send the right work, packaged the right way: referral templates; guideline driven; all the information needed to safely complete the procedure. 3. The receiver agrees to do the work right away.

  34. CRC Screening Process: Problems All Along the Path • Eligible population → annual FOBT screen (problem: patients don’t return cards) • Positive FOBT → consult (problems: long waits, follow-up backlog) • Schedule procedure (problem: no-shows) → do colonoscopy (problem: inadequate prep) • Cancer? Yes → refer for treatment → definitive treatment; No → repeat procedure in 10 years

  35. CRC Dx Improvement Strategies • Decrease inappropriate screening • Strengthen service agreements/consult templates • Improve patient colonoscopy prep • Track positive screens to ensure follow-up • Use fee-basis or contract care to eliminate the backlog • Add permanent staff • Other (LOTS!!!)

  36. CRC Tx Improvement Strategies • CPRS enhancements (clinical reminders, quick orders, templates) are useful to ensure guideline reliability/timeliness • Cancer care coordinators streamline process for patients, monitor care, and can maintain contact for lengthy surveillance

  37. C4 Data Collection • Phase I: baseline data; two process evaluation surveys; qualitative interviews with site team leaders; outcome data (to be collected) • Phase I Spread: national facility survey; national success case method interviews; monitor data • Phase II: CCQMS dataset; pre-intervention assessment of organizational readiness to change; process evaluation survey

  38. Process Change

  39. What have been the most significant barriers to improvement?

  40. Measurement Challenges • Develop a timely measure of timeliness • Establishing a reasonably short hurdle (% receiving follow-up in 60 days) better than mean/median time to follow-up • Ideally the same measure(s) will be useful both within facilities (QI) and between facilities (evaluation)
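The tradeoff described above can be illustrated with a small sketch (hypothetical follow-up intervals, not C4 data): a "% within 60 days" hurdle is computable as soon as each patient's 60-day window closes, whereas a mean or median time to follow-up remains undefined until every patient has actually been followed up, and a mean is distorted by long-tail outliers once they arrive.

```python
# Hypothetical days-to-colonoscopy for patients screened 60+ days ago.
# None = diagnostic follow-up not yet completed (right-censored observation).
intervals = [12, 30, 45, None, 58, 200, None, 25]

# Hurdle measure: computable now. Censored cases simply count as
# "not followed up within 60 days" -- no waiting for them to resolve.
within_60 = sum(1 for d in intervals if d is not None and d <= 60)
pct_within_60 = 100 * within_60 / len(intervals)
print(f"{pct_within_60}% followed up within 60 days")  # 62.5%

# By contrast, mean/median time to follow-up is undefined while any
# case is still open, and the 200-day outlier would dominate a mean.
```

This is why a reasonably short hurdle supports monthly facility-level reporting while a mean or median does not.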

  41. Local FOBT Tracking Tool Features: • Ease of input • Tracks most relevant indicators of improvement • Generates run charts • Adaptable to evolving data needs • Facilitates reporting of FOBT Follow-up monitor data

  42. FOBT Follow-up Monitor • Self-reported • Currently, lack of standardization within VistA across facilities makes centralized collection of monitor data difficult • Tradeoff between rigor and data collection burden on sites • Generated from local QI tracking system (most use the nationally developed FOBT Tracking Tool) • Evolving as definitional issues are encountered

  43. FOBT Follow-up Monitor Which FOBTs should be included? • Inappropriate screening? (e.g. recently screened, limited life expectancy) • Patient refusals? • Patients going outside of VA for follow-up?

  44. FY09 FOBT Follow-up Monitor • Proportion of patients with a positive colorectal cancer (CRC) screening FOBT who have a diagnostic colonoscopy < 60 days after the positive screening FOBT • Numerator: those in the denominator who had a complete diagnostic colonoscopy < 60 days after a positive CRC screening FOBT • Denominator: number of patients with a positive CRC screening FOBT in the measurement month • Exclusions: • Patients who refuse follow-up colonoscopy • Patients who choose to have follow-up colonoscopy outside (i.e., neither performed nor paid for by) the VA • Patients determined to be clinically “inappropriate” for colonoscopy • Patients who have had a previous positive FOBT in FY09 • Patients whose FOBT was not performed as a CRC screening FOBT
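The numerator/denominator/exclusion logic of the monitor can be sketched in a few lines. This is a hedged illustration only: the record fields below are hypothetical, not the actual VistA or CCQMS data layout.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FobtCase:
    """One positive screening FOBT in the measurement month (hypothetical layout)."""
    days_to_colonoscopy: Optional[int]  # None = no diagnostic colonoscopy yet
    refused_followup: bool = False
    followup_outside_va: bool = False
    colonoscopy_inappropriate: bool = False
    prior_positive_this_fy: bool = False
    not_a_screening_fobt: bool = False

def excluded(c: FobtCase) -> bool:
    # The five FY09 exclusions listed on the slide.
    return (c.refused_followup or c.followup_outside_va
            or c.colonoscopy_inappropriate or c.prior_positive_this_fy
            or c.not_a_screening_fobt)

def fobt_monitor(cases: list[FobtCase]) -> float:
    """Proportion with diagnostic colonoscopy < 60 days after the positive FOBT."""
    denom = [c for c in cases if not excluded(c)]
    numer = [c for c in denom
             if c.days_to_colonoscopy is not None and c.days_to_colonoscopy < 60]
    return len(numer) / len(denom) if denom else 0.0

cases = [FobtCase(21), FobtCase(75), FobtCase(None),
         FobtCase(10, refused_followup=True), FobtCase(59)]
print(fobt_monitor(cases))  # 0.5 -- 2 of 4 non-excluded cases < 60 days
```

Note that exclusions are removed from the denominator before the 60-day test is applied, which is what keeps refusals and outside-VA follow-ups from penalizing a facility.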

  45. Monitor Validation • Planned independent assessments: • C4 post-intervention evaluation • Partin grant: DETECT – “Determinants of Timely Evaluation Colonoscopy for CRC+ screening Tests” • Powell CDA manual chart review project • EPRP abstraction

  46. Colorectal Cancer Care Measurement System These measures, when mapped to NCCN Guidelines, will: • Identify facility level gaps in care to patients • Identify facility level deviations from established standards of patient care • Identify systemwide gaps in care to patients • Identify systemwide deviations from established standards of patient care

  47. CCQMS Development Process • Solicited input from VA constituencies • Office of Patient Care Services • Oncology Field Advisory Committee • Team of members at participating sites • Developed specific quality indicators and measures • Sample quality indicator: proportion of patients with resected colon cancer with ≥ 12 lymph nodes examined by pathology • Indicators and measures form the basis for computerized measurement and analyses tools
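The sample quality indicator lends itself to the same kind of arithmetic. A minimal sketch, assuming a simple hypothetical record layout (not the actual CCQMS schema):

```python
# Sample CCQMS-style indicator (hypothetical record layout): proportion of
# patients with resected colon cancer who had >= 12 lymph nodes examined.
resections = [{"nodes_examined": 15}, {"nodes_examined": 9},
              {"nodes_examined": 12}, {"nodes_examined": 22}]

meeting = sum(1 for r in resections if r["nodes_examined"] >= 12)
proportion = meeting / len(resections)
print(f"{meeting}/{len(resections)} = {proportion:.0%}")  # 3/4 = 75%
```

Each indicator in the system reduces to a proportion of this form, which is what makes a common computerized measurement and reporting tool feasible across facilities.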

  48. CCQMS Development Process • Facilities collect measurement data from VA computer systems • Information is reported back to C4 participants • During the improvement collaborative, facility and VA-wide reports are being produced • A goal is to increase data extraction capabilities during the time of the collaborative • Potential to serve as a model for other cancer care quality measurement efforts

  49. CCQMS Operational Design • Individual sites report to the national host (HSR&D / OQP) through the CCQMS reporting feature

  50. CCQMS Data Entry
