
The Scottish Patient Safety Programme



Presentation Transcript


  1. The Scottish Patient Safety Programme. Using Data at the Front-line and Across the System. Terri Simmonds, Jane Murkin, Lindsay Martin

  2. Model for Improvement • Using Data to understand progress toward the team’s aim • Using Data to answer the questions posed in the plan for each PDSA cycle (The Improvement Guide, API)

  3. Need for Measurement • Improvement is not about measurement, but measurement plays an important role: • Key measures are required to assess progress on the team’s aim • Specific measures can be used for learning during PDSA cycles • Balancing measures are needed to assess whether the system as a whole is being improved • Data from the system (including from patients and staff) can be used to focus improvement and refine changes

  4. Reaction to Data Stages of Facing Reality • “The data are wrong” • “The data are right, but it’s not a problem” • “The data are right; it is a problem; but it is not my problem.” • “I accept the burden of improvement”

  5. “The Three Faces of Performance Measurement: Improvement, Accountability and Research” “We are increasingly realizing not only how critical measurement is to the quality improvement we seek but also how counterproductive it can be to mix measurement for accountability or research with measurement for improvement.” Leif Solberg, Gordon Mosser and Sharon McDonald, Joint Commission Journal on Quality Improvement, vol. 23, no. 3 (March 1997), 135-147.

  6. The Three Faces of Performance Measurement

  7. Three Types of Measures • Outcome Measures: Voice of the customer or patient. How is the system performing? What is the result? • Process Measures: Voice of the workings of the system. Are the parts/steps in the system performing as planned? • Balancing Measures: Looking at a system from different directions/dimensions. What happened to the system as we improved the outcome and process measures? (e.g. unanticipated consequences, other factors influencing outcomes)

  8. Measurement Guidelines • A few key measures that clarify a team’s aim and make it tangible should be reported, and studied by the team, each month • Be careful about over-doing process measures for monthly reports • Make use of available databases to develop the measures • Integrate data collection for measures into the daily routine • Plot data on the key measures each month during the life of the project

  9. Measurement Guidelines • The question - How will we know that a change is an improvement? - usually requires more than one measure • A balanced set of five to eight measures will ensure that the system is improved • Balancing measures are needed to assess whether the system as a whole is being improved

  10. SPSP High Level Outcome Measures • Standardised Mortality Ratio (if available) • Percent unadjusted mortality • Adverse event rate (using the Global Trigger Tool)
  SPSP Key Measures: Leadership. Process Measures • Number of WalkRounds completed • Percent of actionable items identified during WalkRounds that are completed

  11. SPSP Key Measures: Critical Care
  Outcome Measures • VAP rate • CLC bloodstream infections or days between CLC bloodstream infections • Staph. aureus Bacteraemias (SABs) rate or days between SABs • C. difficile associated disease rate or days between C. difficile associated disease occurrences • Percent ICU and HDU mortality • Percent of ICU and HDU blood sugar results within range (3.5 – 8.5 mmol/L)
  Process Measures • ALOS (average length of stay) on mechanical ventilation • Percent compliance with the VAP prevention care bundle • Percent compliance with the central line bundle • Percent compliance with hand hygiene • Percent achievement of multi-disciplinary rounds and daily goals • Re-intubation rate
  Balancing Measure • ICU ALOS
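Several of the measures above are offered either as a rate or as “days between” events; for rare events such as SABs, the days-between form often shows change sooner than a rate does. A minimal Python sketch of that calculation (not from the slides; the dates are invented for illustration):

```python
# A sketch of the "days between events" measure used for rare events such as
# SABs; the dates below are hypothetical.
from datetime import date

def days_between_events(event_dates):
    """Return the gaps, in days, between consecutive events."""
    ordered = sorted(event_dates)
    return [(b - a).days for a, b in zip(ordered, ordered[1:])]

sab_dates = [date(2008, 1, 4), date(2008, 2, 19), date(2008, 5, 2)]  # invented
print(days_between_events(sab_dates))  # [46, 73]: widening gaps suggest improvement
```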

  12. SPSP Key Measures: General Ward
  Outcome Measures • Crash call rate • Staph. aureus Bacteraemias (SABs) rate or days between SABs • C. difficile associated disease rate or days between C. difficile associated disease occurrences
  Process Measures • Percent compliance with Early Warning Score assessment • Percent of patients for whom a respiratory rate is recorded each time observation occurs • Percent of patients identified as at risk who have appropriate interventions undertaken in terms of their management as categorised by the early warning score • Number of calls to the outreach team • Percent compliance with hand hygiene • Percent compliance with using safety briefings • Percent compliance with using SBAR

  13. Measurement and Data Collection During PDSA Cycles • Collect useful data, not perfect data - the purpose of the data is learning, not evaluation • Use a pencil and paper until the information system is ready • Use sampling as part of the plan to collect the data to reduce workload • Use qualitative data (feedback) rather than wait for quantitative data • Record what went wrong during the data collection

  14. Overall Project Measures vs. PDSA Cycle Measures
  Data for project measures (achieving the aim): overall results related to the project aim (outcome, process, and balancing measures), collected for the life of the project.
  Data for PDSA measures (adapting changes during PDSA cycles): quantitative data on the impact of a particular change; qualitative data to help refine the change; subsets or stratification of project measures for particular patients or providers; collected only during cycles.

  15. Expectations for Improvement When will my data start to move? • Process measures will start to move first. • Outcome measures will most likely lag behind process measures. • Balancing measures are monitored rather than expected to move, but pay attention if they do.

  16. Integrate Data Collection for Measures into Daily Work • Include the collection of data with another current work activity (for example, pain scores with other vital signs; data from office visit flowsheets) • Develop an easy-to-use data collection form, or make Information Systems input and output easy for clinicians • Clearly define roles and responsibilities for ongoing data collection • Set aside time to review data with all those who collect it

  17. Data Collection Exercise
  1. PDSA Measurement: Your team is planning a PDSA cycle to test the impact of posters on the unit to remind staff to wash hands. The test will last one week on the unit. Design a data collection plan to evaluate the impact of the change during the PDSA. Consider who, what, when, and where, and design the data collection form.
  2. Operations Measurement: Your unit wants to make compliance with hand washing an ongoing measure that will be reported in the unit family of measures each week. Design an ongoing data collection and measurement plan and describe how you would implement it. Make any necessary assumptions about number of staff, etc.

  18. Minimum Standard for Reporting Project Measures: Annotated Time Series

  19. Let the Data Tell the Story: Annotations
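A minimal sketch, using matplotlib, of an annotated time series like the one this slide describes; the monthly VAP values and the annotation text are invented for illustration:

```python
# Plot a measure over time and annotate when a change was tested, so the
# chart itself tells the story of the improvement work.
import matplotlib.pyplot as plt

months = list(range(1, 13))
vap_rate = [12, 11, 13, 10, 9, 9, 7, 6, 6, 5, 4, 4]  # hypothetical monthly values

fig, ax = plt.subplots()
ax.plot(months, vap_rate, marker="o")
ax.annotate("VAP bundle tested on the unit",          # the change being tested
            xy=(5, 9), xytext=(6.5, 12),
            arrowprops={"arrowstyle": "->"})
ax.set_xlabel("Month")
ax.set_ylabel("VAP rate per 1000 ventilator days")
ax.set_title("Annotated time series")
plt.show()
```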

  20. [Charts: cycle time results for Units 1, 2 and 3]

  21. Presenting your data with Time Series

  22. Evaluating Evidence from Run Charts • Let the graph speak for itself • Focus first on practical significance, then statistical significance if necessary • The chart summarises a longitudinal study – changes tested and implemented over time • Consider the changes as a system, not as discrete interventions • Sample size may be important, especially with high reliability processes and rare events • Sometimes useful to develop Shewhart control limits to guide interpretation

  23. Look at the Relationships [Charts: calls to outreach per month and crash call rate per 1000 discharges]

  24. Look at the Relationships
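One way to examine such relationships is to plot the two measures against each other as a scatterplot. A minimal matplotlib sketch, with invented paired values echoing the measures named on slide 23:

```python
# Scatterplot of two related measures: as outreach calls rise, does the
# crash call rate fall? The paired monthly values are hypothetical.
import matplotlib.pyplot as plt

calls_to_outreach = [5, 8, 12, 15, 18, 22, 25, 30]            # per month
crash_call_rate = [4.2, 4.0, 3.6, 3.1, 2.9, 2.4, 2.1, 1.8]    # per 1000 discharges

fig, ax = plt.subplots()
ax.scatter(calls_to_outreach, crash_call_rate)
ax.set_xlabel("Calls to outreach per month")
ax.set_ylabel("Crash call rate per 1000 discharges")
ax.set_title("Look at the relationships")
plt.show()
```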

  25. “Controlling Variation in Health Care: A Consultation with Walter Shewhart” • “Unintended variation”: not purposeful, planned, or guided • “Intended variation”: purposeful, planned, guided, or considered (From Don Berwick, 1991)

  26. Tools to Learn from Variation in Data • Run chart • Shewhart chart • Scatterplot • Frequency plot • Pareto chart

  27. Elements of a Run Chart • The centerline (CL) on a run chart is the median • The measure is plotted on the vertical axis against time on the horizontal axis • Four run tests can be used to determine if non-random patterns are present

  28. Rule 1 • Six or more consecutive points either all above or all below the median. Skip values on the median and continue counting points; values on the median do not make or break a shift.
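A minimal Python sketch of Rule 1, assuming the measure is a plain list of numbers; the example data are invented:

```python
# Rule 1 (shift): six or more consecutive points all above or all below the
# median; points on the median are skipped and neither make nor break a shift.
from statistics import median

def has_shift(data, run_length=6):
    med = median(data)
    count, side = 0, 0
    for x in data:
        if x == med:          # skip values on the median
            continue
        s = 1 if x > med else -1
        count = count + 1 if s == side else 1
        side = s
        if count >= run_length:
            return True
    return False

data = [15, 14, 13, 12, 11, 10, 6, 5, 4, 3, 2, 1]  # hypothetical monthly values
print(has_shift(data))  # True: the first six points all sit above the median (8)
```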

  29. Rule 2 • Five points all going up or all going down. If the value of two or more successive points is the same, ignore one of the points when counting; like values do not make or break a trend. [Example chart: median = 11]
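A matching sketch of Rule 2, again assuming a plain list of numbers; the example data are invented:

```python
# Rule 2 (trend): five or more points all going up or all going down;
# repeated successive values are ignored when counting.
def has_trend(data, trend_length=5):
    deduped = [data[0]]
    for x in data[1:]:
        if x != deduped[-1]:  # like values do not make or break a trend
            deduped.append(x)
    count, direction = 1, 0
    for a, b in zip(deduped, deduped[1:]):
        d = 1 if b > a else -1
        count = count + 1 if d == direction else 2
        direction = d
        if count >= trend_length:
            return True
    return False

print(has_trend([11, 11, 10, 9, 9, 8, 7]))  # True: 11, 10, 9, 8, 7 all going down
```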

  30. Rule 3 To determine the number of runs above and below the median: • A run is a series of points in a row on one side of the median. Some points fall right on the median, which makes it hard to decide which run these points belong to. • An easy way to determine the number of runs is to count the number of times the data line crosses the median and add one. • A statistically significant change is signalled by too few or too many runs. [Example chart: median = 11.4; the data line crosses once, giving too few runs: 2 in total]

  31. Run Chart: Medical Waste • How many runs are on this chart? [Chart note: points on the median are not counted when counting the number of runs]

  32. How many runs are on this chart? Answer: 14 runs. (Points on the median are not counted when counting the number of runs.)

  33. Test 3: Too Few or Too Many Runs. Use this table by first calculating the number of “useful observations” in your data set: subtract the number of data points on the median from the total number of data points. Then find this number in the first column; the lower limit for the number of runs is in the second column and the upper limit in the third. If the number of runs in your data falls below the lower limit or above the upper limit, this is a signal of a special cause.

  Useful observations   Lower number of runs   Upper number of runs
  15                    4                      12
  16                    5                      12
  17                    5                      13
  18                    6                      13
  19                    6                      14
  20                    6                      15
  21                    7                      15
  22                    7                      16
  23                    8                      16
  24                    8                      17
  25                    9                      17
  26                    9                      18
  27                    9                      19
  28                    10                     19
  29                    10                     20
  30                    11                     20
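The crossing-count from slide 30 and the limits in this table combine naturally into code. A minimal sketch, with the limits copied from the table above and invented example data:

```python
# Rule 3 (number of runs): runs = median crossings + 1, computed on the
# "useful" observations (points not on the median), then compared with the
# tabulated lower/upper limits.
from statistics import median

RUN_LIMITS = {15: (4, 12), 16: (5, 12), 17: (5, 13), 18: (6, 13), 19: (6, 14),
              20: (6, 15), 21: (7, 15), 22: (7, 16), 23: (8, 16), 24: (8, 17),
              25: (9, 17), 26: (9, 18), 27: (9, 19), 28: (10, 19), 29: (10, 20),
              30: (11, 20)}

def runs_signal(data):
    med = median(data)
    useful = [x for x in data if x != med]  # drop points on the median
    crossings = sum((a > med) != (b > med) for a, b in zip(useful, useful[1:]))
    runs = crossings + 1
    low, high = RUN_LIMITS[len(useful)]
    return runs < low or runs > high        # True signals a special cause

data = [1, 2, 3, 4, 5, 6, 7, 8, 12, 13, 14, 15, 16, 17, 18, 19]  # hypothetical
print(runs_signal(data))  # True: only 2 runs against a lower limit of 5
```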

  34. Rule 4: Astronomical Value For detecting unusually large or small numbers: • A blatantly obvious different value • Everyone studying the chart agrees that it is unusual • Remember: every data set will have a high and a low; this does not mean the high or low is astronomical

  35. Non-Random Rules for Run Charts • A shift: 6 or more points • A trend: 5 or more points • Too many or too few runs • An astronomical data point. Source: The Data Guide by L. Provost and S. Murray, Austin, Texas, February 2007, pp. 3-10.

  36. Exercise: Apply the Run Chart Rules Source: Page and Washburn. “Tracking Data to Find Complications that Physicians Miss” Joint Commission Journal on Quality Improvement. October 1997, p. 153.

  37. Elements of a Shewhart Chart • The measure is plotted over time with a centerline (CL), an upper control limit (UCL), and a lower control limit (LCL) • A point outside the limits is an indication of a special cause
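The slides do not say how the control limits are calculated; for individual measurements a common choice is the XmR (individuals) chart, where the limits sit at the mean plus or minus 2.66 times the average moving range. A minimal sketch under that assumption, with invented data:

```python
# XmR (individuals) chart limits: CL is the mean, and UCL/LCL are the mean
# +/- 2.66 times the average moving range between consecutive points.
from statistics import mean

def xmr_limits(data):
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    cl = mean(data)                   # centerline
    mr_bar = mean(moving_ranges)      # average moving range
    return cl - 2.66 * mr_bar, cl, cl + 2.66 * mr_bar  # LCL, CL, UCL

readmissions = [12, 9, 11, 14, 10, 13, 8, 12, 15, 11]  # hypothetical monthly counts
lcl, cl, ucl = xmr_limits(readmissions)
print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")
```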

  38. Shewhart Chart: Psychiatric Re-admissions Within 15 Days of Discharge [Chart with upper and lower control limits (UCL, LCL)]

  39. Using Shewhart Charts to Guide Improvement. Select a key measure and develop an appropriate Shewhart chart for it. Is the system stable for this measure? If no: identify special causes and remove them. If yes: identify common causes and change the system (remove common causes).
  (Note: methods and responsibility are ordered by importance.)
  Stable process • Tools/methods: 1. Planned experimentation 2. Rational subgrouping • Responsibility for identification: 1. Technical experts 2. Supervisors 3. Workers in the system • Responsibility for improvement: 1. Management 2. Technical experts
  Unstable process • Tools/methods: 1. Shewhart charts 2. Rational subgrouping 3. Planned experimentation • Responsibility for identification: 1. Workers in the system 2. Supervisors 3. Technical experts • Responsibility for improvement: 1. Local supervision 2. Technical experts 3. Management 4. Workers in the system

  40. Model for Improvement • Using Data to understand progress toward the team’s aim • Using Data to answer the questions posed in the plan for each PDSA cycle (The Improvement Guide, API)
