
Clinical Audit

NTWC Clinical Audit Workshop (19th May 2004): Introduction to Clinical Audit and the Audit Cycle. Dr L C Leung, Chairman of the Central Clinical Audit Committee.


Presentation Transcript


  1. NTWC Clinical Audit Workshop (19th May 2004). Introduction to Clinical Audit and the Audit Cycle. Dr L C Leung, Chairman of the Central Clinical Audit Committee

  2. Clinical Audit • A clinically-led initiative that seeks to improve the quality and outcome of patient care through structured peer review, whereby clinicians examine their practices and results against agreed explicit standards and modify their practice where indicated.

  3. The audit cycle: Identification of Topic → Set Standard → Measure Current Service (Audit) → Compare with Standard → Implement Improvement Measures → Measure Improved Service (Audit) → and round again

  4. Data Analysis Activities • 1. Data handling • 2. Research • 3. Surgical epidemiology • 4. Outcomes investigation • 5. Clinical audit

  5. Data handling • Example • Develop a computerised database and grading system for Plastic Surgery Outcomes
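
A minimal sketch in Python of how such a computerised grading record might be structured; the field names and the A-to-D grading scale are hypothetical illustrations, not the actual system described on the slide:

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical record for a surgical-outcomes grading database.
    @dataclass
    class OutcomeRecord:
        case_no: str          # pseudonymised case identifier
        procedure: str
        operation_date: date
        outcome_grade: str    # hypothetical scale: "A" (excellent) to "D" (poor)

    record = OutcomeRecord("P001", "flap reconstruction", date(2004, 5, 19), "B")
    print(record)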

  6. Research vs Clinical Audit
     Research: Am I singing the right song? / Is X as effective as Y? / "X is always more effective than Y" / To investigate
     Clinical Audit: Am I singing this song right? / Are we doing X, not Y? / "We did X in 75% of cases" / To improve

  7. Research • Example • To investigate whether plain radiographs plus ultrasound are as effective as an IVU in the detection of hydronephrosis.

  8. Surgical Epidemiology • Activity analysis is not audit. Example: 'Audit of paediatric surgery in a DGH'. A retrospective analysis was carried out of activity over 5 years, including types of surgery, deaths, complications and the status of the operator.

  9. Outcomes Investigation • Local validation study • Mortality and morbidity analysis • Patient satisfaction surveys

  10. Local Validation Studies • Example: 'Early results of X groin hernia repair done by trainee surgeons in a DGH'. 'No early recurrences at median follow-up of 7.6 months.' '76% of patients had no complications.' 'Median time to cessation of analgesia was 4 days.'

  11. Mortality and Morbidity • Example: 'Retrospective analysis of 75 gastrectomies carried out over 17 years by a single-handed surgeon in a cottage hospital.' Mean age 63.2 years; male:female ratio 1.2:1; overall mortality rate 4%; etc.

  12. Patient Satisfaction Surveys • Example: '70% of patients replied, and 91% of these were satisfied or very satisfied with the operation. 95% indicated they would be happy to have the same operation again.'

  13. Clinical Audit Process, Step 1: Selection of Topic. Dr S K O, COS (Cl. Oncology)

  14. Audit Cycle: Select Topic. Selecting a topic involves: 1. Identifying candidate topics 2. Deciding on the topic

  15. Suggestions for identifying topics: Maxwell's Dimensions of Quality (1984) • 1. Access (to services): location and coverage of the service • 2. Relevance (to need): relevance of services to the healthcare needs of the population • 3. Equity (fairness): fairness in provision for different groups of people

  16. Suggestions for identifying topics: Maxwell's Dimensions of Quality (1984) • 4. Efficiency: economy in the use of resources • 5. Acceptability: to patients and relatives • 6. Effectiveness: of the care provided

  17. Suggestions for identifying topics: Donabedian (1966) • Structure: amount and type of resources, such as furniture, equipment, staffing, premises, etc. • Process: quantity and types of activities of medical care, investigations or procedures • Outcome: relevant indicators of clinical care that demonstrate improvement in current or future health

  18. DECIDING THE TOPIC. Consider the following factors: • 1. Clinical concern • Wide variation in clinical practice • Recent major changes • High risk • Conditions that demand rapid diagnosis or treatment • Achievable results not being achieved • Complex or difficult management • Involvement of other specialties • 2. Financially important • High volume • High cost

  19. DECIDING THE TOPIC. Consider the following factors: • 3. Practical considerations • Measurable activity • Achievable standards • Adequate sample available • Change can be effected • Results worth the effort • 4. Group support • Enthusiasm and interest • Expertise available • Acceptable effort required

  20. Rings of evaluation of audit topics: score each candidate topic on four dimensions (clinical concern, cost, group support, feasibility) against a five-level scale (1 very low, 2 low, 3 moderate, 4 high, 5 very high).
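
A minimal sketch in Python of how the ring scores could be totalled to rank candidate topics; the topics and ratings below are hypothetical illustrations:

    # Rank candidate audit topics by summing their 1-5 ratings on the
    # four rings: clinical concern, cost, group support, feasibility.
    CRITERIA = ("clinical_concern", "cost", "group_support", "feasibility")

    candidates = {
        "Post-operative wound infection": {"clinical_concern": 5, "cost": 4,
                                           "group_support": 4, "feasibility": 3},
        "Out-patient waiting times":      {"clinical_concern": 3, "cost": 3,
                                           "group_support": 5, "feasibility": 5},
    }

    def total_score(scores):
        """Sum the 1-5 ratings across the four rings (maximum 20)."""
        return sum(scores[c] for c in CRITERIA)

    for topic in sorted(candidates, key=lambda t: total_score(candidates[t]),
                        reverse=True):
        print(f"{topic}: {total_score(candidates[topic])}/20")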

  21. Set audit parameters • Focus of audit: the process of patient management; the clinical outcome and effectiveness of care • Who should be involved: doctors, nurses and allied health professionals; a multi-disciplinary approach should be adopted • What and how to audit: regular random review of case notes and records; regular review of key areas (mortality, infection, clinical incidents, complications); systematic criterion-based audit

  22. Setting standards • Evidence for standards can be obtained through: • Literature review • Comparison with other hospitals / countries • Clinical judgement • Assessment of current practice

  23. Types of standards • External: • Medical literature • World Health Organisation • Academy / College / national guidelines • Internal: • Local benchmarks • Modified external standards

  24. Compare external with internal standards

  25. Clinical Audit Process, Step 3: Design & Measure. By Dr H H YAU, Con (A&E)

  26. 1. Objective(s) Setting • Express what the group intends to achieve by the audit • Design the audit methodology to satisfy the audit objective(s)

  27. 2. What to measure? A. Outcome Measures B. Process Measures C. Process-Outcome Measures

  28. 2A. Outcome Measures • May be the most meaningful measures of quality of care. • However: • important outcomes may happen long after care, • outcomes may be affected by factors outside the practitioners' control, • it is difficult to isolate which services contributed to the outcomes, and • outcome-only data can be misunderstood by the public.

  29. 2B. Process Measures - Process measures give a clearer picture of care delivery but are more time-consuming to audit. - They are more valid, especially when research has demonstrated a direct link with outcomes. - Critical factors or steps can be measured. - Causes of poor outcome can be identified during the care process.

  30. 2C. Process-Outcome Measures • Measure the process together with the outcome (this is preferable). • Process-outcome and "critical" process measures are more useful for identifying areas for improvement. • Use up-to-date references (e.g., protocols, care pathways and outcome statistics), relevant evidence and collective judgement to develop the audit measures.

  31. 3. How to Measure? A. Explicit Measures B. Implicit Measures C. Two-Phase Strategy

  32. 3A. Explicit Measures • Agreed formally by the group as the basis for data collection. • Criteria are defined objectively and unambiguously (e.g., the number of re-operations for the same condition in a defined period). • Represent parameters of good or poor outcome. • Can be applied by an assistant trained to collect data (saving clinicians' time).

  33. 3B. Implicit Measures • Use clinicians' knowledge and judgement • Subjective in nature • Individual cases or situations can be analysed (e.g., screening out planned re-operation cases) • Audit findings are more credible to clinicians

  34. 3C. Two-Phase Strategy. The best approach is to use explicit measures followed by implicit measures (see the sketch below). 1. Screen the cases using explicit measures (this can be carried out by a clinician's assistant). 2. Then conduct a structured peer review using implicit measures to judge whether acceptable care has been given. 3. Identify shortcomings in those problem cases.
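
A minimal sketch in Python of the two-phase strategy; the case records, field names and the 30-day re-operation criterion are hypothetical illustrations:

    # Phase 1: explicit screen - objective enough for a trained assistant.
    # Phase 2: implicit review - clinicians judge each flagged case.
    cases = [
        {"case_no": "A01", "days_to_reoperation": None},  # no re-operation
        {"case_no": "A02", "days_to_reoperation": 12},    # early re-operation
        {"case_no": "A03", "days_to_reoperation": 90},    # late, planned staged repair
    ]

    # Phase 1: flag any re-operation for the same condition within 30 days.
    flagged = [c for c in cases
               if c["days_to_reoperation"] is not None
               and c["days_to_reoperation"] <= 30]

    # Phase 2: refer only the flagged cases for structured peer review.
    for case in flagged:
        print(f"Refer {case['case_no']} for peer review: "
              f"re-operation at day {case['days_to_reoperation']}")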

  35. 4. Data Collection • A. Data definition: precise and well agreed • B. Data source: medical records, IDS • C. Sampling: balance workload against sample size (a sampling sketch follows) • D. Validity: leads to the right conclusion • E. Reliability: pilot trial (<10% error)
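
A minimal sketch in Python of point C, drawing a reproducible random sample of case records for review; the record numbers and sample size are hypothetical illustrations:

    import random

    # Hypothetical pool of medical record numbers eligible for the audit.
    eligible_records = [f"MRN{n:05d}" for n in range(1, 501)]

    # A fixed-size random sample balances workload against sample size;
    # a fixed seed makes the selection reproducible for a second review.
    random.seed(2004)
    sample = random.sample(eligible_records, k=50)
    print(sample[:5])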

  36. 5. Documentation • Develop an audit tool for the collection of data (e.g., checklists or audit forms) • Record the rating results, marking scheme and data definitions on the forms for a second or future review • Write an audit report showing the key components of the audit cycle

  37. 6. Ethics and Confidentiality • With patient participants: participation on a voluntary basis; patients fully informed in advance • Audit data and reports must not identify individual patients or staff by name: assign case numbers or letters to represent patient and staff identities

  38. Clinical Audit Process, Step 4: Evaluate the Results (Compare with Standards). By Ms Bonnie WONG, CM (Q&RM)

  39. A. Analysing Data (I) • 1. Group data into useful information. For example: % of non-compliance for individual groups of patients or staff members, the range of deviations, types of complications, or sites of incision.

  40. A. Analysing Data (II) • 2. Analyse the overall pattern or common trend in actual practice, e.g., on which duty shift cases happen most, which age group of patients is affected, or which team of staff is involved most often.

  41. A. Analysing Data (III) • 3. Adjust the preliminary audit findings: take account of special situations and determine "Allowable Exceptions" for clinically sound reasons through structured peer review.

  42. B. Presenting the Audit Results. Preliminary results from the data collection are combined with the allowable exceptions determined by structured peer review:

  % meeting Audit Measures (Standards) = (No. of cases meeting the audit measures + No. of cases determined clinically acceptable (allowable exceptions)) / (Total no. of cases reviewed in the audit) × 100
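
A worked sketch of this formula in Python; the counts are hypothetical illustrations:

    def pct_meeting_standards(met, allowable_exceptions, total):
        """(cases meeting measures + allowable exceptions) / total x 100."""
        return (met + allowable_exceptions) / total * 100

    # Hypothetical audit: 120 cases reviewed, 96 met the measures outright,
    # and structured peer review judged 9 more clinically acceptable.
    print(f"{pct_meeting_standards(96, 9, 120):.1f}%")  # 87.5%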

  43. C. Document the Exceptions • 1. State the criteria for granting the allowances or exceptions, or • 2. Explain in the report the reasons for the cases judged to be "exceptions".

  44. D. Compare with Standards (I) Target standards: • How far are we from the pre-set target? (outcome measures) • Are we higher or lower than the reference / agreed standards? • How large is the gap?

  45. D. Compare with Standards (II) Criterion-based standards: • What percentage of cases meet each criterion? (process measures) • Which criteria scored an unsatisfactorily low compliance rate? • What are the potential contributing factors?

  46. E. Identify Causes of Shortcomings. What are the potential contributing factors for the shortcomings? • Formal group discussion • Break problems down into organisational vs individual staff problems (Deming's rule: 94% vs 6%) • Identify the root causes that led to the shortcomings

  47. F. Examples of systemic problems • Unclear direction, objectives or working guidelines • Poor communication between staff and supervisors • Lack of monitoring and feedback mechanisms • Insufficient staff supervision • Insufficient training and development • Lack of access to up-to-date information • Ineffective work arrangements or time scheduling • Inappropriate staff or skill mix • Inappropriate tools, equipment or facilities • Lack of incentives or encouragement

  48. G. Identify Needed Improvements • Based on the analysis of the root causes of the problems, develop strategies for improvement. • Next is action!

  49. Step 5: Re-audit • Repeat the audit process for improvement • Re-audit as frequently as required, and as quickly as possible, until the needed improvements are achieved and sustained • Repeat the audit cycle several times until the changes made are working as intended (a sketch of this loop follows)
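
A minimal sketch in Python of the re-audit loop; the 90% standard and the improving measurement function are hypothetical illustrations standing in for real audit rounds:

    STANDARD = 90.0  # hypothetical target compliance rate (%)

    def run_audit(cycle):
        """Stand-in for a real audit round; pretend each cycle improves."""
        return 70.0 + 8.0 * cycle

    cycle, compliance = 0, run_audit(0)
    while compliance < STANDARD:
        cycle += 1                     # implement improvements, then re-measure
        compliance = run_audit(cycle)
        print(f"Re-audit {cycle}: {compliance:.0f}% compliance")
    # Keep periodic re-audits even after the standard is met, to sustain it.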

  50. Step 6: Compare improved service with standard. [Chart: the ongoing audit cycle shown as % of success plotted against time; each successive change raises performance stepwise towards the standard line.]
