Presentation Transcript


  1. Evaluation. This PPT and other resources from: http://homepage.mac.com/johnovr/FileSharing2.html. John Øvretveit, Director of Research, Professor of Health Innovation and Evaluation, Karolinska Institutet, Stockholm, Sweden

  2. POINT 1 Evaluation means different things to different people

  3. Evaluation definition (Øvretveit 1998, 2002) • judging the value of something • by gathering information about it • in a systematic way • and by making a comparison, • for the purpose of making a better informed decision.

  4. What I will cover • Who is the evaluation for and their questions? • Three approaches • Their answers to different questions • Their ways of maximising validity • Theory-driven case evaluation • When and how to document and study context

  5. Examples • Standardisable treatment or change to an organisation • New chronic care model • Breakthrough collaborative • Joint Commission accreditation

  6. Who is the evaluation for and their questions? User-focused evaluation • Who is the main customer for the evaluation and what are their questions? • Design your evaluation to give the information they need to make the decisions they need to make vs. a literature-based focus • The evaluation is to fill gaps in scientific knowledge

  7. Key questions for evaluations • Does it work? • (outcomes caused by the intervention) • Would it work here locally? • How did they implement this change? (description) • Which context factors helped and hindered implementation? (attribution) • In which range of settings and conditions did it work? (generalisation certainty) • How do I adapt it, or the context, to implement it? (adaptation)

  8. Three approaches answer different questions • Controlled experimental • Does it work? • Quasi-experimental • Does it work? (less certain, but easier and less costly) • Theory-informed case study evaluation • Does it have effects which may lead to patient outcomes? • How does it work?

  9. 3) RCT Experimental

  10. 3) Experimental intervention: before-after single case

  11. Time series (multiple before/after)
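
A time series design strengthens a single before/after comparison by taking several measurements on each side of the intervention, so that a pre-existing trend is not mistaken for an effect. A minimal sketch in Python, with hypothetical monthly infection rates (the data and variable names are illustrative, not from the slides):

```python
# Interrupted time series sketch: fit the pre-intervention trend by
# least squares, then compare post-intervention observations with
# what that trend would have predicted.
before = [10.2, 9.9, 9.7, 9.6, 9.3, 9.1]  # months 0-5, pre-intervention
after = [7.8, 7.5, 7.4, 7.1, 6.9, 6.8]    # months 6-11, post-intervention

n = len(before)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(before) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, before)) / sum(
    (x - x_mean) ** 2 for x in xs
)
intercept = y_mean - slope * x_mean

# Effect = observed post-intervention values minus the extrapolated trend.
effects = [y - (intercept + slope * (n + i)) for i, y in enumerate(after)]
print(f"pre-intervention slope: {slope:+.2f} per month")
print(f"mean effect beyond the trend: {sum(effects) / len(effects):+.2f}")
```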

  12. Stepped wedge design
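
In a stepped wedge design the intervention is rolled out to clusters (e.g. clinics or wards) in a staggered sequence, so every cluster eventually receives it and each contributes both control and intervention periods. A minimal sketch of the assignment schedule (the cluster and period counts are illustrative):

```python
# Stepped wedge rollout schedule: each row is a cluster, each column a
# measurement period; 0 = control, 1 = intervention. One cluster
# crosses over to the intervention at each step.
def stepped_wedge_schedule(n_clusters: int, n_periods: int) -> list[list[int]]:
    schedule = []
    for cluster in range(n_clusters):
        start = cluster + 1  # cluster i crosses over in period i + 1
        schedule.append([1 if period >= start else 0 for period in range(n_periods)])
    return schedule

for row in stepped_wedge_schedule(n_clusters=4, n_periods=5):
    print(row)
# [0, 1, 1, 1, 1]
# [0, 0, 1, 1, 1]
# [0, 0, 0, 1, 1]
# [0, 0, 0, 0, 1]
```

The all-control first period and all-intervention last period are what let the design separate intervention effects from secular trends.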

  13. Case study uses a model of a chain of effects. Programme theory: ideas about the sequence of actions and situation factors leading to intermediate changes and the ultimate outcome (Cretin et al. 2004 model)

  14. VHA advanced access – evaluation model by COLMR

  16. Naturalistic methods: PSI study (UK 100k collaborative)

  17. Their ways of maximising validity. Internal validity of the evaluation • = how certain are we that the outcomes are due to the intervention and not something else? (attribution) • Experimental: control for other explanations • Comparison with no-intervention patients or providers • Time trends • Case study: causal chain, multiple data sources

  18. External validity of the evaluation • = with the intervention on other patients or providers, how certain are we that we would find the same outcomes? (generalisation) • Experimental: • repeat the evaluation with other targets, or use “representative” targets • Case study: the causal chain diagram is a theory which allows decision makers to think through whether it would work in their setting from ONE CASE • Analytic generalisation (not statistical)

  19. The importance of context

  20. Can you grow pineapples in Sweden? Seed = your change idea; gardener, planting and nurture = implementation actions; climate and soil = context. Rate your change: evidence 0-5? + implementation actions 0-5? + context 0-5? (local and wider)
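
Read as a rough heuristic, the analogy suggests scoring each ingredient of a change 0-5 and treating the weakest one as the limiting factor: strong evidence cannot compensate for a hostile context, any more than a good seed compensates for a cold climate. A hypothetical sketch (the scoring function and cut-off are illustrative, not from the slides):

```python
# Sketch of the slide's 0-5 scoring idea: rate the change idea (seed),
# the implementation actions (gardener), and the context (climate/soil).
def change_prospects(evidence: int, implementation: int, context: int) -> str:
    scores = {"evidence": evidence, "implementation": implementation, "context": context}
    for name, score in scores.items():
        if not 0 <= score <= 5:
            raise ValueError(f"{name} must be between 0 and 5")
    total = sum(scores.values())
    weakest = min(scores, key=scores.get)
    # Hypothetical cut-off: any factor at 0 or 1 undermines the rest.
    if scores[weakest] <= 1:
        return f"{total}/15, but {weakest} is critically weak"
    return f"{total}/15"

print(change_prospects(evidence=4, implementation=3, context=1))
# -> "8/15, but context is critically weak"
```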

  21. Distinguish

  22. French 2009 review – factors affecting implementation success. Categories and measures • Climate, e.g. openness, respect, trust • OL culture • Vision • Leadership • Knowledge need • Acquisition of new knowledge • Knowledge sharing • Knowledge use

  23. How do you take account of context? • Experimental • You don’t: you get rid of its interference • Should describe intervention, implementation and settings • Case study • Before: define context factors which might influence implementation – example in a collaborative? • Collect data on these and assess their influence • Build a model/theory with context, not just a causal chain

  24. Summary • Who is the evaluation for and their questions? • Three approaches • Their answers to different questions • Their ways of maximising validity • Theory-driven case evaluation • When and how to document and study context

  25. Your reactions and questions • Any surprises… • Not certain about… • This could be useful…

  26. DETAILS

  27. Questions and criteria

  29. What I will cover • How evaluation is similar to and different from research and monitoring • Challenges all research faces • And how different evaluation designs address these • Naturalistic non-experimental evaluation • Programme evaluation • Case study evaluation • Realist evaluation • Example of evaluation of an HIV/AIDS programme in Zambia

  30. EVALUATION

  31. Evaluation. This PPT and other resources from: http://homepage.mac.com/johnovr/FileSharing2.html. John Øvretveit, Director of Research, Professor of Health Innovation and Evaluation, Karolinska Institutet, Stockholm, Sweden

  32. All research has these five challenges • Users’ wants vs evaluators’ views about what is important • My focus is on user-driven but theory-informed research – data driven by the users’ decisions • Data validity • Are the data we collect valid and reliable? Reducing data bias. Replicability • Cost of data gathering and its value for the evaluation users • How much extra value for these extra data?

  33. All research has these challenges • Attribution • How do we know the outcomes were due to the intervention and not something else? • Generalisation • To which other patients, organisations or settings do we have confidence that the same findings might be observed?

  34. Three types of evaluation • Experimental controlled - outcome • Compare those getting the intervention with another group • Experimental no controls - outcome • Only look at those getting the intervention before and after • Naturalistic – describe and document different impacts
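
The difference between the first two types can be made concrete with a difference-in-differences calculation: the comparison group's before/after change estimates what would have happened anyway, and is subtracted from the intervention group's change. A minimal sketch with made-up numbers (the outcome measure is hypothetical):

```python
# Difference-in-differences sketch: the controlled design subtracts the
# background trend (the control group's change) from the intervention
# group's before/after change.
from statistics import mean

# Hypothetical % of patients meeting a care standard, per provider.
intervention_before = [62, 58, 65, 60]
intervention_after = [74, 71, 78, 73]
control_before = [61, 59, 63, 62]
control_after = [66, 64, 68, 65]

change_intervention = mean(intervention_after) - mean(intervention_before)
change_control = mean(control_after) - mean(control_before)

# An uncontrolled before/after design would report the first number
# alone, attributing the whole change to the intervention.
print(f"uncontrolled before/after estimate: {change_intervention:+.2f}")
print(f"difference-in-differences estimate: {change_intervention - change_control:+.2f}")
```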

  35. 3) Experimental intervention: comparative case

  36. 3) RCT Experimental

  37. Next – experimental, no control group: just before-after outcome. Single B/A intervention to patients; single B/A intervention to providers

  38. 3) Experimental intervention: before-after single case

  39. 3) Evaluation of an intervention to a service: impact on providers

  40. 3) Evaluation of an intervention to a service: impact on patients

  41. Next – non-experimental, process or naturalistic designs • Describe the intervention • E.g. a new service for people with chronic disease – multiple components • How the service evolves and why • Some effects (on the pathway towards outcomes) • E.g. changes in staff practice, work organisation and attitudes

  42. Change chain or influence pathway (“programme theory” or “logic model”): 1) Intervention – training on baby health care. Short-term result: changes nurses’ knowledge, skills and motivation >>> 2) Nurses then train mothers. Medium-term result: changes mothers’ knowledge, skills and motivation >>> 3) Mothers then behave differently. Long-term result: baby health is better. What was the intervention? (three) Which intervention should you evaluate, and how? What is the outcome of the intervention? (three) Point – find out if each intervention was carried out fully, and its results
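
A change chain like this can be written down as a small data structure, which forces each link to name its action, its expected result, and an observable indicator for checking whether that step actually happened. A hypothetical sketch using the baby-health example from the slide:

```python
# Programme theory / logic model as data: each link in the change chain
# names the action, the expected result, and an indicator to measure.
from dataclasses import dataclass

@dataclass
class ChainLink:
    action: str
    expected_result: str
    indicator: str  # what you would measure to check this step

baby_health_chain = [
    ChainLink("Train nurses on baby health care",
              "Nurses' knowledge, skills and motivation change (short term)",
              "Nurse test scores before and after training"),
    ChainLink("Nurses train mothers",
              "Mothers' knowledge, skills and motivation change (medium term)",
              "Mothers' knowledge in follow-up interviews"),
    ChainLink("Mothers change their behaviour",
              "Baby health is better (long term)",
              "Infant health records over the following year"),
]

# An evaluation then checks each link: was the action carried out
# fully, and did the indicator move as the theory predicts?
for i, link in enumerate(baby_health_chain, start=1):
    print(f"{i}. {link.action} -> {link.expected_result} [{link.indicator}]")
```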

  43. Model. Action (e.g. education about how to assess dementia) >>>> Change 1 (personnel do better dementia assessment) >>>> Change 2 (dementia onset slower). Indicator of each change? Helping factor: personnel are given time for the education. Hindering factor: shortage of personnel

  44. Point: many things we evaluate have change chains. One thing changes another, then this change changes another thing. Not just a drug treatment (one intervention), but many interventions, sometimes in sequence

  45. Case studies: programme theory and concepts for describing change. Programme theory: ideas about the sequence of actions and situation factors leading to intermediate changes and to output and outcome changes (their theory, our theory). Cretin et al. 2004 model

  46. MRC CSI safety research model (Brown 2008)

  48. 3) Deductive hypothesis-testing (non-intervention): theory → specific hypotheses → researcher gathers data to test hypotheses, often with a survey → raw data → analyse data → revise theory. (The box is the subject area or sample of people; the timeline runs from study start to study finish.)

  49. Four key questions to plan an evaluation • What is the intervention? • Who is the evaluation for? • Which data do they need about the intervention, its effects and the situation? • How do you know the effects are due to the intervention and not something else?

  50. Does your research study an intervention? • What is the intervention? • what are the different implementation actions you (or others) are taking • at different times?
