Clinical Audit

Presentation Transcript

  1. TMH Clinical Audit Workshop (11th Dec 99). Introduction to Clinical Audit and the Audit Cycle. Dr L C Leung, CONS (Surgery)

  2. Clinical Audit • A clinically led initiative which seeks to improve the quality and outcome of patient care through structured peer review, whereby clinicians examine their practices and results against agreed explicit standards and modify their practice where indicated.

  3. The Audit Cycle: Identification of Topic → Set Standard → Measure Current Service (Audit) → Compare with Standard → Implement Improvement Measures → Measure Improved Service (Audit), and the cycle repeats.

  4. Data Analysis Activities • 1. Data handling • 2. Research • 3. Surgical epidemiology • 4. Outcomes investigation • 5. Clinical audit

  5. Data handling • Example • Develop a computerised database and grading system for Plastic Surgery Outcomes

  6. Data Analysis Activities • 1. Data handling • 2. Research • 3. Surgical epidemiology • 4. Outcomes investigation • 5. Clinical audit

  7. Research vs Clinical Audit. Research asks ‘Am I singing the right song?’: it tests ‘Is X as effective as Y?’, may conclude ‘X is always more effective than Y’, and its aim is to investigate. Clinical audit asks ‘Am I singing this song right?’: it tests ‘Are we doing X, not Y?’, may conclude ‘We did X in 75% of cases’, and its aim is to improve.

  8. Research • Example • To investigate whether plain radiographs plus ultrasound are as effective as an IVU in the detection of hydronephrosis.

  9. Data Analysis Activities • 1. Data handling • 2. Research • 3. Surgical epidemiology • 4. Outcomes investigation • 5. Clinical audit

  10. Surgical Epidemiology. Activity analysis is not audit. Example: ‘Audit of paediatric surgery in a DGH’. A retrospective analysis was carried out of activity over 5 years, including types of surgery, deaths, complications and status of operator.

  11. Data Analysis Activities • 1. Data handling • 2. Research • 3. Surgical epidemiology • 4. Outcomes investigation • 5. Clinical audit

  12. Outcomes Investigation • Local validation study • Mortality and morbidity analysis • Patient satisfaction surveys

  13. Local Validation Studies. Example: ‘Early results of X groin hernia repair done by trainee surgeons in a DGH’. ‘No early recurrences at median follow-up of 7.6 months.’ ‘76% of patients had no complications.’ ‘Median time to cessation of analgesia was 4 days.’

  14. Mortality and Morbidity. Example: ‘Retrospective analysis of 75 gastrectomies carried out over 17 years by a single-handed surgeon in a cottage hospital.’ Mean age was 63.2 years; male:female ratio 1.2:1; overall mortality rate 4%; etc.

  15. Patient Satisfaction Surveys. Example: ‘70% of patients replied, and 91% of these were satisfied or very satisfied with the operation. 95% indicated they would be happy to have the same operation again.’

  16. The Audit Cycle: Identification of Topic → Set Standard → Measure Current Service (Audit) → Compare with Standard → Implement Improvement Measures → Measure Improved Service (Audit), and the cycle repeats.

  17. Clinical Audit Process. Step 1: Selection of Topic. Dr. M. L. Szeto, COS (M&G)

  18. Audit Cycle: Select Topic. Selecting the topic includes: 1. Identifying candidate topics 2. Deciding on the topic

  19. Suggestions for identifying topics: Maxwell’s Dimensions of Quality (1984). 1. Access to services - location and coverage of the service. 2. Relevance to need - of services to the healthcare needs of the population. 3. Equity (fairness) - fairness in provision for different groups of people.

  20. Suggestions for identifying topics: Maxwell’s Dimensions of Quality (1984), continued. 4. Efficiency - economy in the use of resources. 5. Acceptability - to patients and relatives. 6. Effectiveness - of the care provided.

  21. Suggestions for identifying topics: Donabedian (1966). Structure - amount and type of resources, such as furniture, equipment, staffing, premises, etc. Process - quantity and types of activities of medical care, investigations or procedures. Outcome - relevant indicators of clinical care that demonstrate improvement in current or future health.

  22. DECIDING THE TOPIC. Consider the following factors: • 1. Clinical concern • Wide variation in clinical practice • Major recent changes • High risk • Conditions demanding rapid diagnosis or treatment • Achievable results not achieved • Complex or difficult management • Involves other specialties • 2. Financially important • High volume • High cost

  23. DECIDING THE TOPIC. Consider the following factors: • 3. Practical considerations • Measurable activity • Achievable standards • Adequate sample available • Change can be effected • Results worth the effort • 4. Group support • Enthusiasm and interest • Expertise available • Acceptable effort required

  24. Rings of evaluation of audit topics: rate each candidate topic on four dimensions (clinical concern, cost, group support, feasibility) using levels 1 to 5: 5 very high, 4 high, 3 moderate, 2 low, 1 very low.
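A minimal sketch of how the rings of evaluation on slide 24 could be applied in practice. The slide only defines the four dimensions and the 1-5 scale; combining the ratings by summing them to rank topics is an assumption, and the topic names and scores are illustrative, not from the deck.

```python
# Hypothetical scoring of audit topics on the four "rings" from slide 24.
# Each dimension is rated 1 (very low) to 5 (very high); summing the
# ratings to rank topics is an assumed aggregation, not from the slides.

DIMENSIONS = ("clinical_concern", "cost", "group_support", "feasibility")

def total_score(scores: dict) -> int:
    """Validate each 1-5 rating, then sum across the four dimensions."""
    for dim in DIMENSIONS:
        if not 1 <= scores[dim] <= 5:
            raise ValueError(f"{dim} must be rated 1-5")
    return sum(scores[dim] for dim in DIMENSIONS)

# Illustrative candidate topics with made-up ratings
topics = {
    "Post-op wound infection": {"clinical_concern": 5, "cost": 4,
                                "group_support": 4, "feasibility": 3},
    "Outpatient waiting times": {"clinical_concern": 2, "cost": 3,
                                 "group_support": 3, "feasibility": 5},
}

# Highest-scoring topic first
ranked = sorted(topics, key=lambda t: total_score(topics[t]), reverse=True)
```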

  25. Set Audit Parameters. Focus of audit: process of patient management; clinical outcome and effectiveness of care. Who should be involved: doctors, nurses and allied health professionals; a multi-disciplinary approach should be adopted. What and how to audit: regular random review of casenotes and records; regular review of areas such as mortality, infection, clinical incidents and complications; systematic criterion-based audit.

  26. Setting standards Evidence of setting standards can be obtained through : • Literature review • Comparison with other hospitals / countries • Clinical judgement • Assessment of current practice

  27. Types of standards. External • Medical literature • World Health Organisation • Academy / Colleges / National guidelines. Internal • Local benchmarks • Modified external standards

  28. Compare external with internal standards

  29. Clinical Audit Process. Step 3: Design & Measure. Dr. H. H. YAU, COS (A&E)

  30. 1. Objective(s) Setting • Express what the group intends to achieve by the audit • Design the audit methodology to satisfy the audit objective(s)

  31. 2. What to measure? A. Outcome Measures B. Process Measures C. Process-Outcome Measures

  32. 2A. Outcome Measures • May be the most meaningful measures of quality of care. • However… • Important outcomes may happen long after care, • outcomes may be affected by factors outside the control of the practitioners, • it is difficult to isolate which services contributed to the outcomes, & • outcome-only data can be misunderstood by the public.

  33. 2B. Process Measures - Process measures give a clearer picture of care delivery but are more time-consuming to audit. - More valid, esp. when research has demonstrated a direct link with outcomes. - Critical factors or steps can be measured. - Causes of poor outcome can be identified during the care process.

  34. 2C. Process-Outcome Measures. Measure the process together with the outcome (this is preferable). Process-outcome and “critical” process measures are more useful for identifying areas for improvement. Use up-to-date references (e.g., protocols, care pathways and outcome statistics), relevant evidence and collective judgement to develop audit measures.

  35. 3. How to Measure? A. Explicit Measures B. Implicit Measures C. Two-Phase Strategy

  36. 3A. Explicit Measures • Agreed formally by the group as the basis for data collection. • Criteria are defined objectively & unambiguously (e.g., no. of re-operations for the same condition in a defined period). • Represent parameters of good or poor outcome. • Can be used by an assistant trained to collect data (saves clinicians’ time).

  37. 3B. Implicit Measures • Use clinicians’ knowledge and judgement • Subjective in nature • Individual cases or situations can be analysed (e.g., screen out planned re-operation cases) • Audit findings are more credible to clinicians

  38. 3C. Two-Phase Strategy. Best approach - use explicit measures followed by implicit measures. 1. Screen the cases using explicit measures (can be carried out by a clinician’s assistant). 2. Then conduct structured peer review using implicit measures to judge whether acceptable care has been given. 3. Identify shortcomings in those problem cases.

  39. 4. Data Collection A. Data Definition - precise, well agreed B. Data Source - medical records, IDS C. Sampling - balance workload & sample size D. Validity - lead to right conclusion E. Reliability - pilot trial (<10% error)
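Step 4C above calls for sampling that balances workload against sample size. A minimal sketch of drawing a random sample of casenotes for review; the case-number format, population of 400 admissions and sample size of 50 are illustrative assumptions, not figures from the slides.

```python
# Hypothetical casenote sampling for step 4C of data collection.
# A random sample keeps reviewer workload manageable while avoiding
# selection bias; the numbers below are assumed for illustration.
import random

def sample_casenotes(case_ids, sample_size, seed=None):
    """Return a random sample (without replacement) of case IDs,
    capped at the size of the available population."""
    rng = random.Random(seed)  # fixed seed makes the audit reproducible
    return rng.sample(case_ids, min(sample_size, len(case_ids)))

# Illustrative population: 400 admissions in the audit period
cases = [f"HN{n:05d}" for n in range(1, 401)]
audit_sample = sample_casenotes(cases, 50, seed=1)
```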

  40. 5. Documentation • Develop an audit tool for the collection of data (e.g., checklists or audit forms). • Record the rating results, marking scheme and data definitions on the forms for second or future review. • Write an audit report to show the key components of the audit cycle.

  41. 6. Ethics and Confidentiality. With patient participants: - Voluntary basis - Fully informed in advance. Audit data and reports must not identify individual patients or staff by name - assign a case number or letter to represent patient and staff identities.

  42. Clinical Audit Process. Step 4: Evaluate the Results (Compare with Standards). Ms Bonnie WONG, M(IA)1

  43. A. Analysing Data (I) • 1. Group data into useful information, for example: % of non-compliance for individual groups of patients or staff members, range of deviations, types of complications, or site of incisions.

  44. A. Analysing Data (II) • 2. Analyse the overall pattern or common trend of actual practice - e.g., in which duty shift incidents happen most, which age group of patients is affected, or which team of staff is involved most.

  45. A. Analysing Data (III) • 3. Adjust the preliminary audit findings. • - Take account of special situations, determine “Allowable Exceptions” for clinically-sound reasons through structured peer review.

  46. B. Presenting the Audit Results (after structured peer review of the preliminary results): % meeting Audit Measures (Standards) = (no. of cases meeting the audit measures + no. of cases determined clinically acceptable, i.e. allowable exceptions) / total no. of cases reviewed in the audit × 100
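A minimal sketch of the slide-46 calculation: the compliance percentage counts both cases that met the audit measures and the allowable exceptions accepted through structured peer review. Function and variable names, and the example figures, are illustrative, not from the slides.

```python
# Compliance percentage from slide 46:
# (cases meeting measures + allowable exceptions) / total reviewed * 100

def percent_meeting_standard(met: int, exceptions: int, total: int) -> float:
    """Return the % of audited cases meeting the standards, counting
    peer-review-approved 'allowable exceptions' as acceptable care."""
    if total <= 0:
        raise ValueError("total cases reviewed must be positive")
    return (met + exceptions) / total * 100

# e.g. 82 compliant cases plus 6 peer-reviewed exceptions out of 100 notes
rate = percent_meeting_standard(met=82, exceptions=6, total=100)  # 88.0
```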

  47. C. Document the Exceptions • 1. State the criteria for giving the allowances or exceptions, or • 2. Explain the reasons for the cases that were judged as “exceptions” in the report.

  48. D. Compare with Standards (I) Target Standards: - How far away from the pre-set target? (Outcome Measures) - Are we higher or lower than the reference/ agreed standards? - How much is the gap?

  49. D. Compare with Standards (II). Criterion-based Standards: - What % of cases meet these criteria? (Process Measures) - Which criteria scored an unsatisfactorily low compliance rate? - What are the potential contributing factors?

  50. E. Identify Causes of Shortcomings. What are the potential contributing factors for the shortcomings? - Formal group discussion - Break down problems into organisational or individual staff problems (Deming’s rule: 94% vs 6%) - Identify the root causes that lead to the shortcomings.