
Evaluation and evaluative research in healthcare: analysis of an example

This article analyzes an example of evaluative research in healthcare, specifically focusing on the effectiveness and efficiency of interventions. It discusses the different types of studies and designs used in evaluative research and examines factors that can threaten internal and external validity. The study example evaluated the impact of educational outreach visits on the prescription patterns of non-steroidal anti-inflammatory drugs in primary care.


Presentation Transcript


  1. Evaluation and evaluative research in healthcare: analysis of an example Enrique Bernal Delgado, PhD Marisol Galeote Mayor, PhD Félix Pradas Arnal, MD Salvador Peiró Moreno, PhD Soledad Márquez Calderón, PhD

  2. Types of studies of interventions and organizations
  • Descriptive studies: describe and/or quantify what organizations or interventions, or some aspect of them, are like and/or how they function.
  • Development studies: design or plan interventions (practices, processes, programs, policies) or new organizations.
  • Explanatory studies: understand how organizations or interventions work, and the factors that influence them.
  • Evaluation studies: formulate judgements about interventions, or some aspect of them.
  • Evaluative research studies: find out whether interventions are effective and/or efficient.

  3. Evaluative research
  • The objective is to find out whether interventions are effective and/or efficient.
  • The design should therefore define:
  • the causal model behind the hypothesis established;
  • the variables of the study: dependent, independent and other prognostic variables;
  • the units of observation;
  • the instruments used for measurement;
  • the time frame in which measurement or observation takes place;
  • the factors that can affect the study's internal and external validity.

  4. Designs for evaluative studies
  • Post-test: EG: X 0 | EG + neqCG: X 0 / -- 0 | EG + eqCG: R X 0 / R -- 0
  • Pre-test: EG: 0 X 0 | EG + neqCG: 0 X 0 / 0 -- 0 | EG + eqCG: R 0 X 0 / R 0 -- 0
  • Time series: EG: 000 X 000 | EG + neqCG: 000 X 000 / 000 -- 000 | EG + eqCG: R 000 X 000 / R 000 -- 000

  5. Factors that threaten internal validity: uncontrolled concomitant prognostic factors
  • Background: changes in the environment that are simultaneous with the intervention.
  • Maturation: changes in the individuals included in the program in response to natural developments.
  • Test administration: the effect of having administered a pre-test on the subsequent post-test.
  • Instruments: changes in the measurement instruments or in the observers.
  • Regression to the mean.
  • Selection biases: the assignment of individuals to the groups may be differential.
  • Mortality or losses: differential loss of participants.
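Regression to the mean, listed above, is easy to demonstrate with a small simulation. This is an illustrative sketch, not part of the original study, and all numbers are invented: units selected because of an extreme baseline measurement tend to look "improved" at follow-up even when no intervention took place.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
true_rate = rng.normal(50, 5, n)          # stable underlying prescribing rate
baseline = true_rate + rng.normal(0, 10, n)   # noisy pre-test measurement
followup = true_rate + rng.normal(0, 10, n)   # independent post-test measurement

# Select the "worst" decile using the baseline measurement alone
worst = baseline > np.quantile(baseline, 0.9)

print(baseline[worst].mean())  # high, partly because of measurement noise
print(followup[worst].mean())  # drifts back toward the overall mean of 50
```

With no intervention at all, the selected group's follow-up mean sits closer to 50 than its baseline mean, which is why an uncontrolled pre-post comparison of "extreme" prescribers can fake a treatment effect.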

  6. Factors that threaten external validity
  • The effect of a reaction to, or an interaction between, the tests: the pre-test changes the sensitivity of the participant to the intervention.
  • Effect of the interaction between a selection bias and the intervention: the response to the intervention is differential in the experimental group.
  • Effect of reaction to the experiment (Hawthorne): distortion of the effect because the people know they are being studied.
  • Mortality or losses: differential loss of participants.
  • Variability: response variability depends on multiple factors.
  • Generalization of groups or individuals:
  • generalization from a homogeneous group to a population;
  • generalization from the average response of the group to an individual;
  • generalization to other places, time frames or programs.

  7. Designs in evaluative research: control of threats
  Notation: R = random assignment; -- = without random assignment; 0 = measurement or observation; X = intervention evaluated; EG = experimental group; CG = control group.
  • Post-test: EG: X 0 | EG + neqCG: X 0 / -- 0 | EG + eqCG: R X 0 / R -- 0
  • Pre-post-test: EG: 0 X 0 | EG + neqCG: 0 X 0 / 0 -- 0 | EG + eqCG: R 0 X 0 / R 0 -- 0
  • Time series: EG: 000 X 000 | EG + neqCG: 000 X 000 / 000 -- 000 | EG + eqCG: R 000 X 000 / R 000 -- 000

  8. Designs used in the study
  • Pretest-posttest with equivalent control group: R 0 X 0 / R 0 -- 0
  • Multiple time series with equivalent control group: R 000 X 000 / R 000 -- 000
  Bernal-Delgado E, Galeote-Mayor M, Pradas-Arnal F, Peiró S. Evidence-based educational outreach visits: effects on prescription of non-steroidal anti-inflammatory drugs. JECH 2002; 56: 653-8.
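Under the pretest-posttest design with an equivalent control group (R 0 X 0 / R 0 -- 0), the usual effect estimate is the pre-to-post change in the experimental group minus the change in the control group (a difference in differences). A minimal sketch, using invented prescription rates rather than the study's data:

```python
def difference_in_differences(eg_pre, eg_post, cg_pre, cg_post):
    """Effect = (EG post - EG pre) - (CG post - CG pre)."""
    return (eg_post - eg_pre) - (cg_post - cg_pre)

# Hypothetical mean prescriptions of a target NSAID per 1000 visits
effect = difference_in_differences(eg_pre=40.0, eg_post=28.0,
                                   cg_pre=41.0, cg_post=38.0)
print(effect)  # -9.0: the intervention group fell 9 points more than control
```

Subtracting the control group's change is what removes threats such as background events and maturation, which affect both groups alike.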

  9. Summary of the study
  • Objective: to evaluate the effectiveness of group educational outreach visits, based on the systematic review of evidence, in changing prescription patterns for drugs in Primary Care.
  • Population studied: 24 teams with 158 general practitioners in Primary Care in the Healthcare Area of Teruel.
  • Design: experimental, single blind, with one EG and two equivalent CGs.

  10. Causality model (1): detection of variations in prescriptions of different NSAIDs in small areas
  [Figure: small-area variation in NSAID prescribing. Piroxicam: 7; Tenoxicam: 18; Diclofenac: 22]

  11. Causality model (2): hypothesis
  The variations in medical practices (VMP) are due to "ignorance or uncertainty" about the advantages and disadvantages of the different NSAIDs prescribed, which can be mitigated with educational strategies.
  [Figure: causality model for variations in medical practices (VMP)]

  12. (R 0 X 0 / R 0 -- 0) Are the groups equivalent?

  13. (R 0 X 0 / R 0 -- 0) The intervention: clinical problem → search strategy → analysis of evidence → summary of evidence → evidence-based outreach visit → dissemination of recommendations.

  14. (R 0 X 0 / R 0 -- 0) Have changes occurred after the interventions?

  15. Some common deterministic models
  (1) y = a exp(bt)
  (2) y = K [1 + a exp(bt)]^-1
  (3) y = exp(a + b/t)
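Model (1) can be fitted by ordinary least squares after a log transform, since log y = log a + b·t. A sketch on synthetic, noise-free data (the parameter values 100 and -0.05 are invented for illustration, not taken from the study):

```python
import numpy as np

t = np.arange(1, 13)            # e.g. 12 monthly observations
y = 100 * np.exp(-0.05 * t)     # synthetic series with known a=100, b=-0.05

# log y = log a + b*t  ->  linear regression of log(y) on t
b, log_a = np.polyfit(t, np.log(y), 1)
a = np.exp(log_a)

print(round(a, 2), round(b, 4))  # recovers 100.0 and -0.05
```

The logistic model (2) and model (3) are nonlinear in their parameters and would instead need an iterative fit (e.g. nonlinear least squares).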

  16. (R 000 X 000 / R 000 -- 000) Have changes occurred after the interventions?
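One common way to analyze the multiple time-series design (000 X 000 / 000 -- 000) is segmented regression: regress the outcome on time plus a post-intervention indicator, and read the intervention effect off that indicator's coefficient. A minimal sketch on synthetic, noise-free data (all values invented; the original study's actual analysis may differ):

```python
import numpy as np

t = np.arange(24)                  # 24 periods; intervention after period 11
post = (t >= 12).astype(float)     # 1 in post-intervention periods
y = 50 + 0.2 * t - 8 * post        # known level drop of 8 units, no noise

# Design matrix: intercept, secular trend, post-intervention level change
X = np.column_stack([np.ones_like(t, dtype=float), t.astype(float), post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change = coef

print(round(level_change, 2))      # -8.0
```

Modelling the pre-existing trend explicitly is what distinguishes a time-series design from a simple pre-post comparison: a series that was already falling does not get credited to the intervention.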

  17. Control of threats to internal validity
  Could the findings be attributable to:
  • a selection bias?
  • interference from other, simultaneous interventions?
  • maturation?
  • regression to the mean?
  • contamination?
  • misclassification of the effect and differential losses?
  • the effect of the persons conducting the outreach visits?

  18. Control of threats to external validity
  Can these results be generalized?
  • Inclusion criteria were limited.
  • Was there a reaction to the pre-test?
  • Was there a reaction to the experiment (Hawthorne), or biases due to a novelty effect or social desirability?

  19. CONCLUSIONS
  The intervention was more effective than doing nothing. Although there is no proof that this type of intervention is better than others, it does seem to produce an incremental effect beyond that of a placebo. In systems offering few "incentives", such evidence-based educational sessions should be developed.
