
Clinical Insights into Implementing the AHRQ Indicators for Hospital Quality Improvement




Presentation Transcript


  1. Clinical Insights into Implementing the AHRQ Indicators for Hospital Quality Improvement Patrick S. Romano, MD MPH Professor of Medicine and Pediatrics UC Davis School of Medicine AcademyHealth 2006 Child Health Services Research Meeting June 24, 2006

  2. Acknowledgments Funded by AHRQ Support for Quality Indicators II (Contract No. 290-04-0020) • Mamatha Pancholi, AHRQ Project Officer • Marybeth Farquhar, AHRQ QI Senior Advisor • Mark Gritz and Jeffrey Geppert, Project Directors, Battelle Health and Life Sciences • Data used for analyses: Nationwide Inpatient Sample (NIS), 1995-2003, and State Inpatient Databases (SID), 1997-2003 (38 states), both from the Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality

  3. Acknowledgments We gratefully acknowledge the data organizations in participating states that contributed data to HCUP and that we used in this study: the Arizona Department of Health Services; California Office of Statewide Health Planning & Development; Colorado Health & Hospital Association; Connecticut - Chime, Inc.; Florida Agency for Health Care Administration; Georgia: An Association of Hospitals & Health Systems; Hawaii Health Information Corporation; Illinois Health Care Cost Containment Council; Iowa Hospital Association; Kansas Hospital Association; Kentucky Department for Public Health; Maine Health Data Organization; Maryland Health Services Cost Review; Massachusetts Division of Health Care Finance and Policy; Michigan Health & Hospital Association; Minnesota Hospital Association; Missouri Hospital Industry Data Institute; Nebraska Hospital Association; Nevada Department of Human Resources; New Jersey Department of Health & Senior Services; New York State Department of Health; North Carolina Department of Health and Human Services; Ohio Hospital Association; Oregon Association of Hospitals & Health Systems; Pennsylvania Health Care Cost Containment Council; Rhode Island Department of Health; South Carolina State Budget & Control Board; South Dakota Association of Healthcare Organizations; Tennessee Hospital Association; Texas Health Care Information Council; Utah Department of Health; Vermont Association of Hospitals and Health Systems; Virginia Health Information; Washington State Department of Health; West Virginia Health Care Authority; Wisconsin Department of Health & Family Services.

  4. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Validate or test key hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  5. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Validate or test key hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  6. Norton Healthcare Quality Report We don’t have to do this, but … In a spirit of openness and accountability, we will show the public our performance on nationally endorsed lists of quality indicators and practices. • Not: invent or choose indicators that make us look good • Not: hide or redefine indicators that make us look bad

  7. >270 indicators + safe practices • National Quality Forum (NQF) • Hospital care • Adult cardiac surgery • Nursing-sensitive indicators • Safe practices • Shell in place for ambulatory indicators • JCAHO • JCAHO/CMS adult core measures • National patient safety goals • AHRQ • Patient safety indicators (PSIs) • Inpatient quality indicators (IQIs) • Others (e.g., pediatric ORYX, NICU mortality) Also: financials, patient satisfaction

  8. “How we use PSIs and IQIs” • Publicly report rolling 12 months • Risk-adjusted (not smoothed) rates straight from AHRQ software. Period. • Use KY hospital discharge database, despite limited # of diagnosis codes • Create service line report cards (only that patient population; no U.S. benchmark)

  9. Norton Healthcare Surgery Report Card • Red or green if outside the 99% C.I. based on the U.S. rate • September 14, 2005 posting
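The red/green logic above can be sketched as follows. This is a minimal illustration with hypothetical counts, not the AHRQ QI software's actual algorithm: it flags a service line's observed event count when it falls outside a two-sided 99% confidence interval (normal approximation) around what the U.S. benchmark rate would predict.

```python
import math

def flag_rate(events, cases, benchmark_rate, z=2.576):
    """Color a service-line result against a benchmark rate.

    z = 2.576 corresponds to a two-sided 99% interval under a
    normal approximation to the binomial.
    """
    expected = cases * benchmark_rate
    se = math.sqrt(cases * benchmark_rate * (1 - benchmark_rate))
    lower, upper = expected - z * se, expected + z * se
    if events > upper:
        return "red"      # significantly worse than benchmark
    if events < lower:
        return "green"    # significantly better than benchmark
    return "neutral"      # within the 99% interval

# Hypothetical: 12 events in 800 cases vs. a 0.5% U.S. rate
print(flag_rate(12, 800, 0.005))  # -> red
```

With small expected counts an exact binomial interval would be preferable; the normal approximation is used here only to keep the sketch short.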

  10. “Why we did it” • Accountability as a public asset • Clinical care is, in fact, our “widget” • We talk about our financials with bond raters, the press, etc.; why not our clinical performance? • Proactively influence the public reporting arena • Clinical over purely financial • Transparent over proprietary • Evidence-based over arbitrary • Get the organization moving in a direction that is inevitable • Improve our care; “We’ll manage what we measure and report”

  11. Impact of implementing the Norton Healthcare report card • We are still in business. • Better data; less time arguing about the measure and more time improving performance. • Unused data never become valid. • Even a lousy indicator can drive improvement. • Limited public reaction • Mostly favorable physician response • Strong desire to be “within normal limits”

  12. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Validate or test key hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  13. 2005 National Reports on Quality and Disparities

  14. National trends in PSI rates, 2000-2003 (KID): Extremely rare events (<0.01%). Kids’ Inpatient Database 2000 and 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Unadjusted Rates.

  15. National trends in PSI rates, 2000-2003 (KID): Rare events (0.01-0.1%). Kids’ Inpatient Database 2000 and 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Unadjusted Rates.

  16. National trends in PSI rates, 2000-2003 (KID): Low-frequency events (0.1-0.5%). Kids’ Inpatient Database 2000 and 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Unadjusted Rates.

  17. National trends in PSI rates, 2000-2003 (KID): Medium-frequency events (0.5-5.0%). Kids’ Inpatient Database 2000 and 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Unadjusted Rates.

  18. National trends in PSI rates, 2000-2003 (KID): High-frequency events (>5.0%). Kids’ Inpatient Database 2000 and 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Unadjusted Rates.

  19. National trends in PSI rates, 2000-2003 (KID): Potentially avoidable hospital conditions. Kids’ Inpatient Database 2000 and 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Unadjusted Rates.
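The frequency bands used to group the trend slides above (extremely rare through high-frequency) come straight from unadjusted rates, i.e. flagged discharges divided by eligible discharges. A minimal sketch with hypothetical counts (the real rates come from the AHRQ PDI software run against KID):

```python
def unadjusted_rate(numerator, denominator):
    """Unadjusted indicator rate: flagged discharges / eligible discharges."""
    return numerator / denominator

def frequency_band(rate):
    """Bucket a rate into the bands used on the trend slides."""
    if rate < 0.0001:
        return "extremely rare (<0.01%)"
    if rate < 0.001:
        return "rare (0.01-0.1%)"
    if rate < 0.005:
        return "low-frequency (0.1-0.5%)"
    if rate < 0.05:
        return "medium-frequency (0.5-5.0%)"
    return "high-frequency (>5.0%)"

# Hypothetical: 150 flagged events among 600,000 eligible discharges
r = unadjusted_rate(150, 600_000)
print(f"{r:.5%} -> {frequency_band(r)}")  # 0.025%, the rare band
```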

  20. Area Level PDI by Geographic Region Kids’ Inpatient Database 2003. AHRQ Healthcare Cost and Utilization Project. AHRQ PDI Version 3.0b Risk-adjusted Rates.

  21. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Validate or test key hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  22. NACHRI Pediatric Patient Safety Indicator (PSI) Collaborative • Ran the AHRQ PSIs on NACHRI’s Case Mix database, containing 3 million discharges from approximately 70 children’s hospitals. • Developed the NACHRI Pediatric PSI Collaborative, a self-selected group of 20 hospitals interested in pursuing this analysis further • Published a manuscript entitled “Relevance of the AHRQ PSIs for Children’s Hospitals” in the January 2005 issue of Pediatrics. • Developed and released a Patient Safety Indicator Toolkit (available through NACHRI’s website) with sample press release, op-ed, Q&A, and background documents for hospitals to educate their communities and the media on the relevance and utility of PSIs for pediatrics. • Developed an online, secure chart review tool that allowed Collaborative participants to review the preventability of patients flagged as having any of 11 selected PSI events. • Fostered a relationship with AHRQ and Stanford/UC Davis to update each other on NACHRI’s findings and the PedQI development work.

  23. NACHRI Pediatric Patient Safety Indicator (PSI) Collaborative Collaborative Participants • AL / Children’s Hospital of Alabama / Dr. Crayton Farguson* • CA / Lucile Packard CH at Stanford / Dr. Paul Sharek* • CA / UC-Davis / Dr. James Marcin** • DC / Children’s National Medical Center / Dr. Tony Slonim* • CA / Mattel Children’s at UCLA / Ms. Mary Kimball** • FL / All Children’s / Dr. Jack Hutto* • KY / Kosair Children’s Hospital / Dr. Ben Yandell* • LA / Children’s Hospital New Orleans / Ms. Cindy Nuesslein* • MD / Johns Hopkins Children’s Center / Dr. Marlene Miller* • MA / Children’s Hospital Boston / Drs. Daniel Nigrin and Don Goldmann • MI / C.S. Mott Children’s Hospital – U Mich / Dr. Aileen Sedman* • MO / Children’s Mercy Kansas City / Dr. Cathy Carroll* • OH / The Children’s Medical Center Dayton / Dr. Thomas Murphy* • OH / Cincinnati Children’s Medical Center / Drs. Uma Kotagal, Joseph Luria* • OH / Children’s Hospital Columbus / Dr. Thomas Hansen* • OH / Children’s Hospital MC of Akron / Dr. Michael Bird • PA / Children’s Hospital of Philadelphia / Drs. James Stevens, Joel Portnoy • TX / Texas Children’s Hospital / Dr. Joan Shook* • TX / Children’s Medical Center of Dallas / Dr. Fiona Levy, Ms. Kathy Lauwers* • WI / Children’s Hospital of Wisconsin / Dr. Matthew Scanlon*

  24. Key findings from NACHRI’s PSI physician case reviews “…while 40% to 50% may seem low for positive predictive value, in terms of real patients, this means that 4 or 5 out of 10 children had a preventable event for this indicator. This is worth looking at, and the things we are finding, in some instances, will allow for immediate changes that may impact outcomes for future patients.” [Collaborative physician reviewer]

  25. Examples from NACHRI’s PSI physician case reviews • During removal of a nonfunctioning port catheter, the end of the catheter was noted to be "irregular and not smooth cut." It appeared the tip had embolized for an unknown duration… • During replacement of a pacemaker lead, a fragment of the lead broke off, embolized, and ended up lodged (puncture) in the anterolateral papillary muscle. • No notation in the original operative note or nursing record that sponge/needle counts were done and correct. • Count was reported as correct. Sponge discovered on x-ray due to complaints of abdominal pain by the patient. • Child with a bone tumor who had the mandible removed with subsequent bone graft and much packing in the wound. This was supposedly removed before extubation, but at the time of extubation a remaining pack blocked her airway, causing reintubation with pack removal.

  26. Examples from NACHRI’s PSI physician case reviews • …occurred during the insertion of a PICC line. The record indicates that on the first attempt an artery paralleling the basilic vein was cannulated. • Urethral injury after a transurethral ablation of posterior urethral valves, as well as bleeding post circumcision… both required suturing to repair. • 14-year-old female with spina bifida and urinary and fecal incontinence who underwent appendicovesicostomy, continent cecostomy, bladder neck sling for urinary incontinence, and enteroenterostomy. … a small perforation was made in the vagina and was repaired. • 6-week-old with pyloric stenosis who underwent laparoscopic pyloromyotomy. A small gastric mucosal perforation occurred at the end and required opening the abdomen to repair. • Colon was perforated during liver transplant. • Patient underwent transrectal drainage of an abscess on 7/29 with a Foley used to drain the bladder. Patient developed hematuria and required surgical exploration, at which a bladder puncture from the surgery was discovered. • Laparoscopic procedure using a harmonic scalpel. Bleeding noted. Converted to open procedure. Aorta repaired. • Laparoscopic appendectomy. Small opening made in the cecum. Repaired at the time of surgery. • After discharge from spinal fusion, patient presented to clinic with drainage from the operative site. Seroma noted. Admitted and returned to OR. … lacerated baclofen pump catheter. • Rib removed to use for laryngeal reconstruction; pleura was punctured.

  27. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Test interesting hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  28. Effect of work hours reform in NY teaching hospitals on smoothed PSI rates. Poulose BK, et al., Ann Surg 2005;241:847-860

  29. Effect of work hours reform in NY teaching hospitals on smoothed PSI rates. Poulose BK, et al., Ann Surg 2005;241:847-860

  30. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Validate or test key hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  31. Approaches to assessing construct validity • Is the outcome indicator associated with explicit processes of care (e.g., appropriate use of medications)? • Is the outcome indicator associated with implicit process of care (e.g., global ratings of quality)? • Is the outcome indicator associated with nurse staffing or skill mix, physician skill mix, or other aspects of hospital structure? • Is the outcome indicator associated with other meaningful outcomes of care?
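Each of the association questions above reduces to cross-tabulating an outcome indicator against a structure or process measure. A minimal sketch with entirely hypothetical counts (real analyses would use patient-level or hospital-level models with risk adjustment, not a raw 2x2 table):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table:
                 outcome+  outcome-
    process met     a         b
    process not     c         d
    """
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts: discharges meeting a process measure vs. not,
# cross-tabulated against flagged PSI events
stat = chi_square_2x2(a=8, b=992, c=25, d=975)
odds_ratio = (8 * 975) / (992 * 25)
print(f"chi-square = {stat:.2f}, OR = {odds_ratio:.2f}")
```

A chi-square above 3.84 (1 df) indicates association at p < 0.05; an odds ratio below 1 in this layout would mean fewer flagged events where the process of care was met, the direction construct validity predicts.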

  32. Estimating the impact of preventing each PSI event on mortality, LOS, and charges. NIS 2000 analysis by Zhan & Miller, JAMA 2003;290:1868-74

  33. Estimating the impact of preventing each PSI event on mortality, LOS, and charges. Zhan & Miller, JAMA 2003; key findings replicated by Rosen et al., 2005. * All differences also NS for transfusion reaction and complications of anesthesia in VA/PTF. † Mortality difference NS for foreign body in VA/PTF.

  34. Some PSI rates are significantly higher in African-Americans or Hispanics than in whites. Coffey RM, et al., Med Care 2005;43:I-48 to I-57

  35. Some PSI rates are significantly lower in African-Americans or Hispanics than in whites. Coffey RM, et al., Med Care 2005;43:I-48 to I-57

  36. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Test interesting hypotheses (research) • Is it a coding/documentation issue? • Is it a quality issue? • Questions and answers

  37. Coding/documentation issues • There is a basic tension between using administrative data for reimbursement and for defining quality indicators • Submitting bills quickly versus coding from a complete record • Maximizing the coding of complications and comorbidities versus only coding diagnoses “out of the norm.” • Variation in QI rates might be due to variation in: • Data availability (e.g., number of diagnosis codes, admission type, external cause of injury codes) • Documentation completeness and accuracy • ICD-9-CM and DRG coding • Performance (e.g., processes of care, staffing)

  38. ICD-9-CM Coding Adherence to best practices in coding and compliance with coding guidelines will ensure fair reimbursement and accurate measurement of quality indicators • Use the highest possible level of specificity • Avoid overuse of NEC and NOS designation • Follow guidelines re coding of secondary diagnoses • Only codes that impact treatment or complications • Follow guidelines re coding of procedures • Only significant procedures to be reported

  39. ICD-9-CM Coding: Specificity • Highest level of specificity • Avoid overuse of NEC and NOS designation • Examples: • Using 512.8 for “pneumothorax NOS” would exclude a case from the numerator for iatrogenic pneumothorax (512.1)
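The 512.8 example can be made concrete: indicator numerators match specific codes, so a nonspecific NOS code silently drops a true event. A minimal sketch (the authoritative code sets are in the AHRQ QI technical specifications; this set is abbreviated for illustration):

```python
# Abbreviated numerator code set for iatrogenic pneumothorax;
# the full definition lives in the AHRQ PSI technical specifications.
IATROGENIC_PNEUMOTHORAX = {"512.1"}

def in_numerator(secondary_dx_codes):
    """True only if the specific iatrogenic code was assigned."""
    return bool(IATROGENIC_PNEUMOTHORAX & set(secondary_dx_codes))

print(in_numerator(["512.1", "486"]))  # specific code -> counted
print(in_numerator(["512.8"]))         # "pneumothorax NOS" -> missed
```

The same event coded two ways thus yields two different measured rates, which is why coding specificity is a prerequisite for valid indicator comparisons.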

  40. Coding of secondary diagnoses • For reporting purposes the definition for "other diagnoses" is interpreted as additional conditions that affect patient care in terms of requiring: clinical evaluation; or therapeutic treatment; or diagnostic procedures; or extended length of hospital stay; or increased nursing care and/or monitoring. • UHDDS…defines Other Diagnoses as “all conditions that coexist at the time of admission, that develop subsequently, or that affect the treatment received and/or the length of stay. Diagnoses that relate to an earlier episode which have no bearing on the current hospital stay are to be excluded.”

  41. Coding of secondary diagnoses • “Abnormal findings (laboratory, x-ray, pathologic, and other diagnostic results) are not coded and reported unless the physician indicates their clinical significance.” • “If the findings are outside the normal range and the physician has ordered other tests to evaluate the condition or prescribed treatment, it is appropriate to ask the physician whether the abnormal finding should be added.” • “All conditions that occur following surgery…are not complications… there must be more than a routinely expected condition or occurrence… there must be a cause-and-effect relationship between the care provided and the condition…”

  42. A case study of birth trauma. Dallas-Fort Worth Hospital Council

  43. Confusion about coding • The code index entry under “Molding, head” lists 767.3 • ICD-9-CM Coding Manual definitions: 767.3 Other Injuries To Skeleton Due To Birth Trauma (fracture of long bones, skull); 767.4 Injury To Spine And Spinal Cord Due To Birth Trauma (dislocation, fracture, laceration, or rupture of spine or spinal cord due to birth trauma)

  44. ICD-9-CM Coding: Procedures • Coding of procedures: “The UHDDS requires all significant procedures to be reported… A significant procedure is defined as one that meets any of the following conditions: is surgical in nature; carries an anesthetic risk; carries a procedural risk; requires specialized training.” • What about central venous catheters?

  45. Examples of ICD-9-CM limitations: “Selected infections due to medical care” and “Postoperative hemorrhage or hematoma” • 999.3 Other infection: infection, sepsis, or septicemia following infusion, injection, transfusion, or vaccination. Excludes the listed conditions when specified as due to implanted device (996.60-996.69) or postoperative NOS (998.51-998.59). • 998.1 Hemorrhage or hematoma or seroma complicating a procedure. Excludes hemorrhage, hematoma, or seroma complicating cesarean section or puerperal perineal wound (674.3) or due to implanted device or graft (996.70-996.79). • 998.11 Hemorrhage complicating a procedure • 998.12 Hematoma complicating a procedure
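Indicator software has to operationalize these "excludes" notes as code-range checks against the whole record. A hedged sketch of how such logic might look; the ranges follow the slide, but the real inclusion/exclusion rules are those in the AHRQ QI specifications, not this simplification:

```python
def code_in_range(code, start, end):
    """Numeric comparison of ICD-9-CM codes like '996.60'..'996.69'."""
    return float(start) <= float(code) <= float(end)

def counts_as_postinfusion_infection(code, all_codes):
    """Simplified 999.3 logic: the infection counts only when no
    'excludes' code (implanted device 996.60-996.69, postoperative
    NOS 998.51-998.59) appears elsewhere on the record."""
    if code != "999.3":
        return False
    for c in all_codes:
        if (code_in_range(c, "996.60", "996.69")
                or code_in_range(c, "998.51", "998.59")):
            return False
    return True

print(counts_as_postinfusion_infection("999.3", ["999.3"]))
print(counts_as_postinfusion_infection("999.3", ["999.3", "998.51"]))
```

The limitation the slide illustrates is visible here: once the excludes note routes an event to a device or postoperative code, the 999.3-based indicator can no longer see it.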

  46. Coding Resources • American Health Information Management Association (AHIMA) • www.ahima.org • American Hospital Association • www.hospitalconnect.com/ahacentraloffice/ahaco/index.jsp • National Center for Health Statistics • www.cdc.gov/nchs/icd9.htm • Centers for Medicare and Medicaid Services • www.cms.gov • AHIMA Resources and Practice Briefs • www.ahima.org/infocenter/practice_tools.asp • Developing a Coding Compliance Policy Document • Developing a Physician Query Process • Ongoing Coding Reviews: Ways to Ensure Quality • HIM’s Role in Monitoring Patient Safety • Internet Resources for Coding and Reimbursement Practices

  47. Overview of insights (?) • Why use the PedQIs? • Establish accountability • Surveillance/track performance over time and across hospitals/units/services • Trigger case finding, root cause analyses, identification of clusters • Evaluate impact of interventions • Test interesting hypotheses (research) • Is it a coding issue? • Is it a quality issue? • Questions and answers

  48. Relevance of AHRQ PSIs for Children’s Hospitals. Sedman A, et al. Pediatrics 2005;115(1):135-145

  49. Linking the PedQIs to quality • New collaboration with NACHRI to conduct chart reviews for PedQIs, focused on confirming the event, describing how it occurred, confirming correct risk stratification, and assessing preventability. • Build collaborative network with other partners, in which UC/Stanford/Battelle will provide: • Standardized, pretested abstraction tools • Abstraction guidelines and resources • Training programs for chart reviewers • Online tools for data collection, management, and cleaning • Summarized data reports for partners with suggestions for improvement (based on data from entire network) • Optional chart over-reading to establish reliability/validity

  50. Goals of collaborative projects • Tier indicators based on validity or potential usefulness for CQI and public reporting; flag indicators that don’t make the grade • Inform NQF review process • Modify indicator definitions if possible to improve sensitivity/specificity • Identify omitted risk factors to improve risk-adjustment • Identify key loci of preventability (opportunities for improvement): what should providers with high rates look for in evaluating their care? What can hospitals learn from the leaders?
