
Implementation of quality indicators: barriers and facilitators





Presentation Transcript


  1. Implementation of quality indicators: barriers and facilitators Peter HJ van der Voort, MD, PhD, MSc Dept of intensive care Onze Lieve Vrouwe Gasthuis Amsterdam, The Netherlands

  2. Content • Indicators and quality improvement • The Dutch Intensive Care Registry • Barriers to implementation of QI • Facilitators • Implementation strategies • From indicator to improvement of care • InFoQi study with the Dutch indicator set

  3. Dutch National Intensive Care Registry (NICE)

  4. Benchmark of outcome • SMR • length of stay

  5. Benchmark of outcome • SMR • length of stay • How to improve?
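The SMR used for this benchmark is the ratio of observed to expected (case-mix-adjusted) deaths. A minimal sketch with hypothetical figures and an illustrative function name; the NICE registry itself derives expected mortality from a calibrated prognostic model (e.g. APACHE), not from this code:

```python
# Standardized mortality ratio (SMR): observed deaths divided by the
# number of deaths a prognostic model predicts for the same case mix.
# An SMR of 1.0 means mortality matches the prediction; below 1.0 is
# better than expected, above 1.0 worse.

def smr(observed_deaths: int, expected_deaths: float) -> float:
    if expected_deaths <= 0:
        raise ValueError("expected_deaths must be positive")
    return observed_deaths / expected_deaths

# Hypothetical ICU: 120 observed deaths vs. 150 predicted.
print(round(smr(120, 150.0), 2))  # 0.8
```

An SMR below 1.0 still says nothing about *how* to improve, which is the point the next slide makes.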

  6. Dutch Society of Intensive Care • 2003-2004: development of an indicator set to analyse quality of care • Based on the quality domains of the IOM (Institute of Medicine)

  7. National Guideline On Intensive Care Organisation • 1993, revised 2005, implemented 2006 • Section on quality improvement: • Quality Indicator set NVIC incorporated in guideline • No indicators to follow implementation of the guideline

  8. How to use Quality Indicators 1: As a tool to measure the implementation of a guideline

  9. E.g. the 2006 guideline on organisation advises having regional partners for collaboration (volume–outcome relation) and for discussing individual patients. • Indicator: % of patients discussed, out of total admitted

  10. How to use Quality Indicators 2 As a part of Total Quality Management

  11. The Dutch Quality Indicator Set • Cooperation between NICE and NVIC • Set developed by the NVIC • Benchmark by NICE • Pilot: registration workload and definitions • Active promotion to Dutch ICUs

  12. Implementation of the indicator set appeared to be a QI plan by itself

  13. Implementation of the indicator set appeared to be a QI plan by itself • Problems: • Creating a sense of urgency to use indicators • How to implement the registration of indicators • How to feed back results • How to draw conclusions • How to implement changes

  14. Quality Domains • Definition of quality indicators • Adoption • Diffusion • Dissemination • AIRE

  15. Purpose, relevance and the organisational context the indicator applies to • Involvement of professionals • Scientific evidence • Additional reasoning and use

  16. Quality Domains • Definition of quality indicators • AIRE • PDSA

  17. Avedis Donabedian 1919-2000 • Structure: • Organisation, resources and equipment • Process: • Process of care between caregiver and patient • Outcome • Results (at patient level)

  18. Dutch indicators - internal
  Structure: • Availability of intensivists • Nurse-to-patient ratio • Policy to prevent medication errors • Registration of patient and family satisfaction
  Process: • Length of stay in the ICU • Duration of ventilation • Number of days with 100% of beds occupied • Glucose regulation
  Outcome: • Mortality • Incidence of pressure sores • Unplanned extubation

  19. Quality Domains • Definition of quality indicators • AIRE • PDSA

  20. Decision to use a set of indicators • Organize and implement registration of data • Data validity • Data export • Analysis and benchmark (NICE) • Feedback • Interpretation and conclusion • Plan for change • Implementation of changes / new methods

  21. Barriers on all levels

  22. Decision to use a set of indicators • Sense of urgency • Intrinsic motivation to improve • Legislation • Hospital directors • Society of Intensive Care • Convincing professionals that using indicators improves care • What we did: inform, offer tools

  23. Decision to use a set of indicators • Organize and implement registration of data • Data validity • Data export • Analysis and benchmark (NICE) • Feedback • Interpretation and conclusion • Plan for change • Implementation of changes / new methods

  24. Intervention study: InFoQi to improve QI using indicators • 3 interventions: • Extensive feedback • QI team • Educational Outreach

  25. To develop an optimal intervention • Literature search on optimal feedback • Search for barriers in literature, expert groups, questionnaire

  26. Quality Domains • Definition of quality indicators • AIRE • PDSA

  27. Implementation strategies using indicator data • Educational meeting • Educational outreach • Audit and feedback • Development of a quality improvement plan • Financial incentives • Supporting activities: • Distribution of educational material • Use of a local opinion leader • Development of a quality improvement team

  28. Educational meeting • Participation in conferences, seminars, lectures, workshops or training sessions. • During these meetings, feedback of quality indicators is presented, and study participants discuss how to improve performance.

  29. Educational outreach • A trained independent person or investigator meets with health professionals or managers in their practice setting to provide information (e.g. feedback of quality indicators).

  30. Development of a quality improvement plan • A plan based on indicator data to be used to improve the quality of care.

  31. Financial incentives • Rewarding individual health professionals or institutions with higher payments when they improve performance.

  32. Audit and feedback • Giving a report, including a summary of clinical performance over a specified period of time.

  33. “any summary of clinical performance over a specified period of time”

  34. “It is striking how little can be discerned about the effects of audit and feedback based on the 118 trials included in this review.” “Low baseline compliance with recommended practice and higher intensity of audit and feedback were associated with larger adjusted risk ratios (greater effectiveness) across studies.”

  35. Feedback of analysed data

  36. Jamtvedt et al. reviewed information feedback based on any health care data source, while we focused on feedback based on data from medical registries. • We aimed to include not only RCTs, but any peer-reviewed paper on information feedback within the context of a medical registry. Furthermore, whereas Jamtvedt et al. only reported on the effectiveness of information feedback, we also aimed to identify the barriers and success factors to this effectiveness as reported in the literature.

  37. 53 papers; 50 feedback initiatives • 24 analytic papers covering 22 studies that evaluated the effect of a feedback method on one (n=8) or more (n=14) primary clinical outcome measures • Positive effect on all outcome measures: 4 • Mix of positive and no effects: 8 • No effect at all: 10 • None of the 22 studies reported a negative effect.

  38. MFA = multifaceted approach

  39. “To review the literature concerning strategies for implementing quality indicators in hospital care, and their effectiveness in improving the quality of care”

  40. 21 studies (9 RCTs, 2 CCTs and 10 before-after studies) • 17 in the US; 14 on cardiovascular care • 1-379 participating hospitals • Indicators and hospital care: • 20 on process of care • 6 on patient outcomes • Follow-up: 6 months • de Vos et al. Int J Qual Health Care 2008;1-11

  41. de Vos et al. Int J Qual Health Care 2008;1-11

  42. de Vos et al. Int J Qual Health Care 2008;1-11 • Study design was unrelated to effectiveness • Results on outcomes: • 4 studies: indicators ineffective • 1 partially effective • 1 effective • Effective: >50% significant improvement; partially effective: ±50% improvement; ineffective: <50% improvement
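The cut-offs in the slide above can be written down as a small classifier. The function name and the tie handling (reading "±50%" as exactly half of the indicators) are assumptions for illustration:

```python
def classify_effectiveness(n_improved: int, n_total: int) -> str:
    """Label a study by the share of its indicators showing significant
    improvement: >50% effective, exactly 50% partially effective,
    <50% ineffective (thresholds as reported by de Vos et al. 2008)."""
    share = n_improved / n_total
    if share > 0.5:
        return "effective"
    if share == 0.5:
        return "partially effective"
    return "ineffective"

print(classify_effectiveness(3, 4))  # effective
print(classify_effectiveness(1, 4))  # ineffective
```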

  43. Results on process: • 20 studies • 3: no significant improvement at all • 8: improvement in some process indicators • 7: partially effective • 2: significant improvement in all process indicators • Most are effective on the process of care. de Vos et al. Int J Qual Health Care 2008;1-11

  44. de Vos et al. Int J Qual Health Care 2008;1-11
