
Does publicly reported stroke data identify which institutions provide better stroke care?



1. Does publicly reported stroke data identify which institutions provide better stroke care?
Adam G. Kelly, MD
May 20, 2010

2. Disclosures
• Financial disclosures: None
• Unlabeled/unapproved uses disclosure: None

3. Disclosures
• Financial disclosures: None
• Unlabeled/unapproved uses disclosure: None
• Other disclosures: Faculty member/attending neurologist at Strong Memorial and Highland Hospitals

4. Outline
• Describe the current state of publicly reported stroke quality data
  • How much data are available to the public?
  • What is the content of publicly available data?
• Evaluate the utility of current data
  • Which attributes of publicly available data matter most to consumers?
• Conclusions and recommendations

5. History of public reporting
• Little reporting of outcomes until the 1980s
• 1986: release of hospital-based mortality for all Medicare patients, and for those admitted with 9 specific diagnoses or procedures
• Steady increase since then in the amount of publicly available outcome data, driven by:
  • Competition for a limited patient pool
  • Internet availability

6. History of public reporting
• Much of the publicly available data is surgical in nature; data for common medical conditions (MI, pneumonia, CHF, etc.) have been growing
• Stroke would appear to be a prime candidate for public reporting:
  • Large public health burden
  • High morbidity and mortality
  • Available and validated performance measures

7. Quality data for stroke
• What is the current amount of publicly available stroke quality data?
• Data source: Agency for Healthcare Research and Quality (AHRQ) Report Card Compendium
  • First released in November 2006
  • Updated periodically
  • Free and publicly available

8. Results
• 221 report cards were included in the AHRQ Compendium as of Spring 2008; 16 were not accessible
• Of the remaining 205 report cards, 19 (9%) reported data on stroke quality
• 17 of these reported hospital-based data
• 16 of the 17 sites combined data for ischemic stroke, intracerebral hemorrhage, and subarachnoid hemorrhage

9. Quality data for stroke
• What is the content of the report cards that report stroke data?
• 5 separate categories of data:
  • Outcomes
  • Process
  • Structure
  • Utilization
  • Financial

10. Results
• 17 report cards presented hospital-based stroke quality data
• Utilization measures were the most frequently reported type of data (15 sites):
  • Case volumes, lengths of stay (risk-adjusted)
• Outcome measures were reported by 14 sites:
  • Mortality rates (inpatient out to 180 days), complication rates, re-admission rates
  • All risk-adjusted (a sketch of a common adjustment approach follows below)
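The talk does not describe how these rates are risk-adjusted; a common approach is indirect standardization via an observed-to-expected (O/E) ratio. A minimal sketch, assuming per-patient risks from a hypothetical severity model (all values made up):

```python
# Minimal sketch of observed-to-expected (O/E) risk adjustment for a
# mortality rate. predicted_risks would come from a severity model
# (age, stroke subtype, comorbidities, ...); values here are made up,
# and the report cards' actual methods are not described in this talk.

def risk_adjusted_mortality(died, predicted_risks, statewide_rate):
    """Indirectly standardized rate: (observed / expected) * statewide rate."""
    observed = sum(died)              # deaths actually seen at this hospital
    expected = sum(predicted_risks)   # deaths the severity model predicts
    return observed / expected * statewide_rate

# Hypothetical hospital with 5 stroke admissions (1 = died)
died = [1, 0, 0, 1, 0]
predicted_risks = [0.60, 0.05, 0.10, 0.40, 0.15]   # made-up model output
print(risk_adjusted_mortality(died, predicted_risks, statewide_rate=0.11))
# ~0.169: observed deaths exceed expected, so the adjusted rate is
# above the hypothetical statewide 11%
```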

11. Results
• Financial measures were reported by 4 sites:
  • Costs, charges
• Structure measures were infrequently reported (2 sites):
  • Presence of a dedicated stroke unit, number of beds in the stroke unit, Joint Commission Stroke Center or state Primary Stroke Center designation

12. Results
• Process measures were reported by a single site (based in the UK, not the USA):
  • Use of CT scans, review of cases by a neurologist, arrangement for therapy at time of discharge
• Patient/family satisfaction with care was not reported by any site

13. Summary
• Few publicly available report cards provide stroke quality data
• The available stroke quality data largely consist of administrative data (utilization, mortality, financial data)

14. Utility of publicly available data
• Which attributes of publicly available data matter most to consumers?
  • Timeliness
  • Reliability
  • Sensitivity to change in hospital performance
  • Ability to discriminate between high- and low-performing hospitals
  • Validity
• How do current report cards measure up in these areas?

15. Timeliness
• Currently available quality data range in age from 1 to 3 years
• Some sites use data from 4-5 years prior
• Do data in this timeframe truly reflect current hospital performance?
• Reporting should strive to make data as close to real time as possible

16. Reliability
• 14 report cards provide ratings based on mortality; do they agree on hospital performance?
• Frequent disagreement has been noted in surgical report card ratings, though it has not been quantified
• What is the agreement rate for report card ratings of stroke care at all New York State hospitals?

17. Reliability
• 157 of 214 NYS hospitals were evaluated by two separate report cards (one from a non-profit agency, one from a for-profit corporation)
• Both report cards use a 3-tiered rating system to evaluate inpatient stroke mortality
  • One compares hospital mortality to the state average; the other compares observed hospital mortality to risk-adjusted expected mortality
• Ratings were congruent (in agreement) for only 61% of hospitals (a sketch of this calculation follows below)
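The congruence figure above is simple percent agreement. A minimal sketch of that calculation, using made-up tier assignments rather than the actual NYS ratings:

```python
# Minimal sketch of the percent-agreement calculation between two
# 3-tiered report cards. The ratings below are made up, not NYS data.

def agreement_rate(card_a, card_b):
    """Fraction of hospitals placed in the same tier by both report cards."""
    matches = sum(a == b for a, b in zip(card_a, card_b))
    return matches / len(card_a)

# Tiers: -1 = below average, 0 = average, +1 = above average
card_a = [0, 0, 1, -1, 0, 1, 0, -1, 0, 0]
card_b = [0, 1, 1, -1, -1, 0, 0, -1, 0, 1]
print(f"{agreement_rate(card_a, card_b):.0%} agreement")  # 60% here
```

Note that percent agreement does not correct for agreement expected by chance; a chance-corrected statistic such as Cohen's kappa is often reported alongside it.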

  18. Reliability

19. Reliability
• Results from other states with two ratings of inpatient mortality:
  • Texas: agreement rate 74.3%
  • Pennsylvania: agreement rate 67.4%
  • Massachusetts: agreement rate 50%

20. Reliability
• What are the reasons for poor agreement on hospital mortality ratings?
  • Differences in patient populations
  • Different risk-adjustment and statistical techniques
• What are the implications of poor agreement?
  • Eroded trust among patients/consumers

21. Sensitivity to change
• How frequently do mortality-based hospital ratings change over time?
  • 20% of NYS hospital ratings changed over a one-year timeframe
  • Only 5% of hospitals had their ratings changed by both rating systems
• Are these changes indicative of:
  • A change in hospital performance?
  • A change in patient population?
  • A change in methods of evaluation?

22. Discriminative ability
• Can currently available stroke quality data discriminate between low- and high-performing hospitals?
• Mortality-based hospital ratings rest on 95% confidence intervals and comparisons to expected mortality rates
• Ratings may be more sensitive for high-volume hospitals, with limited ability to discriminate performance in low- and medium-volume institutions (see the sketch below)
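A minimal sketch of why interval-based ratings discriminate mainly at high volume: the confidence interval narrows with case count, so only large hospitals can exclude the expected rate. The interval uses a normal approximation, and all numbers are made up:

```python
import math

# Sketch: a hospital is flagged (above/below average) only when the
# 95% CI around its observed mortality rate excludes the expected
# rate. Normal-approximation interval; all numbers are illustrative.

def flagged(deaths, cases, expected_rate, z=1.96):
    p = deaths / cases
    half_width = z * math.sqrt(p * (1 - p) / cases)   # 95% CI half-width
    return abs(p - expected_rate) > half_width        # expected outside CI?

expected = 0.10   # hypothetical risk-adjusted expected mortality
for deaths, cases in [(7, 50), (75, 500), (300, 2000)]:
    tier = "flagged" if flagged(deaths, cases, expected) else "average tier"
    print(f"{cases:>4} cases, {deaths / cases:.0%} observed: {tier}")
# 50 cases      -> average tier: the interval is too wide to exclude 10%
# 500/2000 cases -> flagged: a similar ~15% observed rate, narrower interval
```

By this mechanism, small hospitals default to the middle tier regardless of their true performance.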

23. Discriminative ability
[Figure: observed mortality rates with 95% confidence intervals, plotted against case volume]
• Highland, Park Ridge, and Strong Memorial Hospitals are all assigned 2-star (average) ratings; Rochester General Hospital is assigned 1-star (below average)

24. Validity
• Difficult to determine which quality measure is most valid
  • Unclear which measure is most important to patients and the public
• What are we hoping to accomplish with public reporting of quality data?
• If the goal is to increase transparency and better inform patient decisions:
  • Limited role for financial data
  • Uncertain role for utilization data

25. Is mortality a valid measure?
• It has many desirable aspects of an endpoint:
  • Definitive/objective
  • Quantifiable
  • Clinically relevant
  • Easily accessible
  • Should be easily comprehended

26. Is mortality a valid measure?
• Should diseases with markedly different mortality rates be combined?
  • Ischemic stroke: 8-12%
  • Intracerebral hemorrhage: 37-44%
  • Subarachnoid hemorrhage: > 50%
• Does it correlate with structure/process of care?
  • Differences in adherence to performance measures explain < 10% of the variation in mortality rates
  • The distribution of mortality ratings was no different among the 107 NYS Designated Stroke Centers

27. Is mortality a valid measure?
• Is it a marker of unsafe care?
  • Unsafe practices are implicated in, and potentially responsible for, < 10% of short-term deaths
• Or is it a marker of patient/family preferences?
  • The majority of in-hospital deaths on a neurology service follow a patient/family decision to withdraw care
• Does it send the correct message?
  • It reinforces the notion that death is universally and unconditionally a negative outcome

28. Conclusions
• Publicly available stroke quality data are limited in their ability to identify high-performing stroke centers due to:
  • Narrow scope
  • Over-reliance on utilization and other administrative data
  • Lack of real-time data
  • Inconsistency across multiple sites
• Inpatient mortality may not be the most appropriate marker of quality stroke care

29. Recommendations
• Provide separate measures for ischemic stroke, intracerebral hemorrhage, and subarachnoid hemorrhage
• Develop methods of reporting data on a more timely basis
• Treat mortality with greater skepticism as a primary measure of quality care
  • Separate deaths due to unsafe practices from those due to patient/family preference

30. Recommendations
• Develop a standard set of process measures to be tracked and reported, harmonized with pre-existing measures recommended by the Brain Attack Coalition, the Joint Commission, and state-specific guidelines
• Examples: IV t-PA consideration/utilization, use of antiplatelet agents, use of warfarin for AF, DVT prophylaxis, etc. (a sketch of how such measures are scored follows below)
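Process measures of this kind are typically scored as the share of eligible patients who received the indicated care, after exclusions. A minimal sketch, using a made-up warfarin-for-AF measure; the field names and exclusion logic are illustrative, not a published specification:

```python
# Sketch: scoring one process measure as received / eligible.
# Patient records, field names, and exclusion rules are hypothetical.

def adherence(patients, eligible, received):
    """Adherence rate for one measure, skipping non-eligible patients."""
    pool = [p for p in patients if eligible(p)]
    if not pool:
        return None  # measure not applicable at this hospital
    return sum(received(p) for p in pool) / len(pool)

patients = [
    {"afib": True,  "warfarin": True,  "contraindication": False},
    {"afib": True,  "warfarin": False, "contraindication": False},
    {"afib": False, "warfarin": False, "contraindication": False},
]
rate = adherence(
    patients,
    eligible=lambda p: p["afib"] and not p["contraindication"],
    received=lambda p: p["warfarin"],
)
print(f"Warfarin-for-AF adherence: {rate:.0%}")  # 50% in this toy cohort
```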

31. Recommendations
• Incorporate patient/family satisfaction into publicly reported data
• Encourage mandatory reporting of all measures
