
Module 6 – Evaluation Methods and Techniques



  1. Module 6 – Evaluation Methods and Techniques

  2. Overview of the Module. How the evaluation will be done; questions and criteria; methods and techniques; quality. Contents: 1- Methods, techniques and tools; 2- Methods for the evaluation of impacts; 3- Techniques commonly used for evaluations

  3. Questions and Techniques Each type of evaluation question is associated with specific techniques

  4. What Does an Evaluation Question Include? Each evaluation question implies a specific approach that allows the evaluator to gather the elements he/she needs to build a line of reasoning leading to a sound (or convincing) judgement

  5. Examples of Questions. Did the implementation of this program unfold as planned? How can you give a valid answer to this question? 1) I will identify what was planned in the original documents: document review; interviews with various people in charge to ensure that I have a good understanding of the situation. 2) I will reconstruct what happened in reality: document review, review of archive files; interviews with people involved in the different modules and the different phases, and with different sensitivities

  6. Examples of Questions (cont’d). Did the implementation of this program unfold as planned? 3) I will compare the plan with what happened, identify the gaps, and check that the opinions collected on this issue lead to the same conclusions: preparation of a retrospective time chart; organization of the arguments. 4) I will decide what constitutes a minor change and a notable change, and flag only the changes that have had visible consequences on costs and delivery time: interviews with various stakeholders to check the validity of the conclusions reached. This is how I propose to proceed to answer this question

  7. Examples of Questions (cont’d). Did the implementation of this program unfold as planned? Various techniques must be used in this approach: document review, file review, one-on-one interviews, discussion groups. These techniques call for specific tools: interview guides, time charts

  8. Second Example. Second question, to be selected with the group, preferably on efficiency. Follow the reasoning together: • How do you answer the question (consider going all the way back to how the question is worded)? • What process do you follow? • What techniques and tools do you use? • Compare with the previous approach.

  9. The whole evaluation process consists of tools, techniques and a method. The term ‘method’ is generally used to designate the process for evaluating impacts

  10. At the Heart of Evaluation: The Measurement of the Effects. The rationale of a program is to produce an effect or an impact. [Diagram: a results chain running from means and activities, through direct results, to effects (the objective) and long- and short-term impacts (the purpose).] How do you measure the impact (effects) of a program? This is one of the key methodological issues for evaluations

  11. Impact Assessment Methods. Solutions must be found for two problems: • To what extent is it possible to identify the effect (revenues increase, the prevalence of a disease goes down, etc.)? • To what extent can this effect be attributed to the program (and not to some other cause)? To find the best possible answers to these two questions, methods that are specific to evaluation are used

  12. Impact Assessment Methods. The evaluator’s key question: what would have happened to the beneficiaries if the program had not existed? How do you rewrite history? How do you get baseline data?

  13. The Only Solution Beyond Doubt: The Ideal Experimentation

  14. The Ideal Experimentation, with an equivalent control group. [Chart: income level over time; the beneficiaries rise from 10 to 30 with the program, while the equivalent control group rises from 10 to 17; effect or impact = 30 - 17 = 13.] In theory, a single observation is not enough. Extreme care must be taken when selecting the control group to ensure comparability.
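With an equivalent control group, both groups start at the same level, so the effect is simply the end-of-program difference between them. A minimal sketch using the figures from the slide:

```python
# Effect estimation with an equivalent control group.
# Both groups start at income level 10, so any end-of-program
# difference between them can be attributed to the program.
beneficiaries_end = 30   # income level of beneficiaries after the program
control_end = 17         # income level of the equivalent control group

effect = beneficiaries_end - control_end
print(effect)  # 13, as on the slide
```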

  15. The Ideal Experimentation, with an equivalent control group. In practice, it is extremely difficult to assemble an exactly comparable control group: • Ethical problem (condemning a group to not being beneficiaries) • Difficulty of finding an equivalent group outside the project • Costs. Therefore, this solution is hardly ever used

  16. Comparison with a Non-Equivalent Control Group. [Chart: income level over time; the beneficiaries rise from 10 to 30, the non-equivalent control group from 14 to 21; effect or impact = 13.] Establishment of a baseline study: data on the before and after situations is needed.
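Because the control group is not equivalent, a single end-of-program comparison is biased; one common way to use the before and after data is a difference-in-differences calculation. A sketch, assuming one plausible reading of the slide's figures (beneficiaries 10 to 30, control group 14 to 21):

```python
# Difference-in-differences with a non-equivalent control group.
# The figures below are an assumed reading of the slide's chart.
beneficiaries_before, beneficiaries_after = 10, 30
control_before, control_after = 14, 21

# Change in the target group, minus the change that occurred anyway
# in the (non-equivalent) control group.
effect = (beneficiaries_after - beneficiaries_before) - (control_after - control_before)
print(effect)  # 13
```

The subtraction of the control group's change is what distinguishes this from a naive before/after comparison of the beneficiaries alone.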

  17. Evaluation without a Comparison Group, Using a Before/After Comparison. [Chart: income level of the beneficiaries over time; a baseline study records the starting level (10), a broad descriptive survey the final level (30), with intermediate time-series observations (14, 17, 21); effect or impact?] Findings on the impact lack precision and soundness. The time series makes it possible to reach better conclusions.
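Without a comparison group, the raw before/after difference is a change, not an impact; the value of the time series is that it lets the evaluator compare the final level against the pre-existing trend. A sketch with hypothetical figures (the series below is illustrative, not taken as established from the slide):

```python
# Before/after comparison without a control group (illustrative figures).
pre_series = [10, 14, 17, 21]  # hypothetical time-series observations
post_level = 30                # level observed after the program

# Naive before/after difference: measures change, not impact.
naive_change = post_level - pre_series[0]  # 20

# Average step of the pre-existing trend, projected one period forward.
avg_step = (pre_series[-1] - pre_series[0]) / (len(pre_series) - 1)
projected = pre_series[-1] + avg_step      # level expected without the program

# Change beyond the extrapolated trend: a sounder (but still fragile) estimate.
trend_adjusted = post_level - projected
```

Even the trend-adjusted figure cannot be attributed to the program with certainty, which is the limitation the slide points out.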

  18. Evaluation Using a Simple Post-Implementation Observation. [Chart: only the final income level of the beneficiaries (30) is observed; effect or impact?] It is impossible to reach a conclusion regarding the impact: while it is possible to say whether or not the objective has been reached (the effect was achieved), the result cannot be attributed to the program.

  19. Four Broad Categories of Evaluation Methods, by quality: (-) Evaluation without a control group: post-implementation observation. (+) Evaluation without a control group: observation before and after implementation. (++) Evaluation by comparison with a control group: non-equivalent control group. (+++) Evaluation by comparison with a control group: equivalent control group (true experimentation).

  20. 1- Evaluation Using a Simple Post-Implementation Observation (-). Possibilities: Very simple to do. Suitable for the evaluation of means and implementation. Allows evaluators to: (i) ascertain how a policy was implemented; (ii) measure its immediate results (outputs); (iii) gain a better understanding of the behaviours of the groups involved and of the tools or mechanisms. Limitations: Difficult to isolate the effects of the policy from the other evolution factors. Huge risk of subjectivity. Can hardly be used to evaluate the impact or identify a pattern explaining the phenomena that were observed. Causal relations can be suggested, but the findings regarding these links are very fragile and cannot easily be extended to other situations. Techniques used: File review, direct observation, expert opinions, case study, statistical surveys, data analyses, calculation of ratios, comparisons with standards, etc.

  21. 2- Evaluation Using a Before/After Comparison (+). Possibilities: Very common. Corresponds to the natural evaluation process: checking that the period over which the policy was implemented coincides with a change in some of the indicators. Same possibilities as the previous method; in addition, makes it possible to produce a more refined and quantified description of the effects. Limitations: Same as for the previous method, but allows evaluators to use more precise indicators and to frequently cross-check results to test their soundness. Techniques used: Requires a good description of the baseline situation for all the project results indicators. File review, direct observation, expert opinions, case study, statistical surveys, data analyses, time series analysis, calculation of ratios, comparison with standards, etc.

  22. 3- Evaluation Using a Comparison with a Non-Equivalent Control Group (++). Possibilities: Comparison of the policy’s target group with a control group with slightly different characteristics. Allows evaluators to (i) better define the impact or the external results of the policy (without affirming there is a causal relation) and (ii) reveal the mechanisms and behaviours at work regarding incentive policies. Limitations: Limited relevance for identifying causal relations without any ambiguity. By increasing the number of control groups, it is possible to strengthen the findings. When a group is specifically consulted for the evaluation without being ‘equivalent’, complex statistical techniques sometimes make it possible to isolate the biases associated with the non-equivalence (‘quasi-experimentation’). Techniques used: Case study, statistical survey, data analysis, time series analysis, multivariate analysis, modeling.

  23. 4- Evaluation Using a Comparison with an Equivalent Control Group (true experimentation) (+++). Possibilities: The only totally rigorous evaluation procedure, used, among other fields, for therapeutic issues (evaluation of the effects of a medical treatment). It makes it possible to identify causal relations without any doubt and, as a result, the specific effects of a policy or a project. Limitations: Numerous feasibility problems are encountered in the field of socio-economic policies when trying to assemble an equivalent control group. This group must be constituted before the project or program is launched, following extremely strict rules. Can be unethical. Techniques used: Statistical survey, data analysis, time series analysis, multivariate analysis, modeling.
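The step that makes the control group equivalent "by construction" is random assignment before the program is launched. A minimal sketch (the population size and seed are arbitrary illustrations, not from the slides):

```python
# Random assignment of eligible individuals to treatment and control,
# the defining step of 'true experimentation'.
import random

random.seed(42)                 # fixed seed for a reproducible draw
candidates = list(range(100))   # hypothetical eligible individuals
random.shuffle(candidates)      # randomize the order before splitting

treatment = candidates[:50]     # will receive the program
control = candidates[50:]       # statistically equivalent group, by the draw
```

Randomization makes the two groups comparable on average, which is why this is the only design that identifies causal effects without ambiguity, and also why it must happen before launch.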

  24. Techniques, Tools, Instruments... Practically all the techniques used in economics and political sciences, especially in statistics, can be used for evaluation. • Interview • Discussion group • Literature search • Archive file review • Questionnaire survey • Case study • Aptitude or knowledge test • Opinion poll • Content analysis • …

  25. Techniques, Tools, Instruments ... One of the qualities required of an evaluator is the ability to identify or devise, for each question, the technical processes that will lead to convincing evaluation conclusions.

  26. Techniques, Tools, Instruments ... It is not possible, in the time available for this workshop, to discuss in detail the techniques, tools and instruments that can be used. It should be noted that evaluators do not necessarily master all these techniques and they often have to call upon specialists. Some of the most commonly used techniques will be presented as part of the exercises.

  27. Techniques, Tools, Instruments … For those of you who want to know more on the techniques, tools and instruments available, there are numerous manuals, guides, etc. on the following topics: • Data collection • Interviews, discussion groups • Direct observations, case studies • Data analysis • ...

  28. Summary of Methodologies
