
Monitoring and evaluation after 2013 – some first ideas, mainly for ERDF / CF

This document discusses ideas for monitoring and evaluation after 2013, with a focus on the shift to performance orientation and the development of an evaluation culture. It also includes suggestions for more precise monitoring rules, for striving for the best methods in evaluation, and for the exchange of knowledge across Member States facilitated by the Commission. The document explores the possibility of collapsing "results" and "impacts" into "outcomes" and considers changing the terminology for monitoring data. It also addresses the different tasks for evaluation, including supporting programming and strategy development, checking intervention logic, refining outcome indicators, and capturing the effects of interventions through counterfactual and qualitative methods.


Presentation Transcript


  1. Monitoring and evaluation after 2013 – some first ideas, mainly for ERDF / CF
  Evaluation network, DG REGIO, 14th October 2010

  2. Guiding ideas
  • Shift to performance orientation: development of an evaluation culture plus some well-targeted obligations
  • More focus in programme design
  • More precise rules for monitoring
  • Few rules, but striving for the best methods in evaluation
  • More exchange of knowledge across MS, facilitated by the Commission
  • No discussion of conditionalities today

  3. Monitoring
  • Monitoring for each individual intervention done by:
    • Custom indicators – as at present, plus
    • Obligatory core indicators (recommended to date):
      • Number of projects, reported across the board
      • Outputs
  • Definitions
  • Linked to categories of expenditure
  • Condition for payments
  (One possible data layout for such a monitoring record is sketched below.)
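To make the structure of such monitoring data concrete, here is a minimal sketch in Python. The field names and values are hypothetical, not a prescribed data model; it only illustrates how the custom indicators, the obligatory core indicators and the link to categories of expenditure could be recorded per intervention.

    from dataclasses import dataclass, field

    @dataclass
    class MonitoringRecord:
        """One monitoring record per intervention (hypothetical layout)."""
        intervention_id: str
        expenditure_category: str                      # link to categories of expenditure
        number_of_projects: int                        # obligatory core indicator, across the board
        outputs: dict = field(default_factory=dict)    # output indicators with agreed definitions
        custom_indicators: dict = field(default_factory=dict)  # programme-specific indicators, as at present

    record = MonitoringRecord(
        intervention_id="OP-XX-001",                   # hypothetical identifier
        expenditure_category="RTD infrastructure",
        number_of_projects=12,
        outputs={"enterprises supported": 40},
    )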

  4. Monitoring II
  • Commission could recommend some result indicators ("what is going to be changed?")
  • Idea: collapse "results" and "impacts" into "outcomes" (or "results"?)
  • Commission to suggest a link between Europe 2020 and selected outcome indicators (in some cases output and input indicators – e.g. the 3% target for RTD)

  5. Sidestep on terminology
  • REGIO is considering changing the terminology for monitoring data (observables) to either:
    • Inputs – outputs – results (results including intermediate + final results), or
    • Inputs – outputs – outcomes (outcomes including results and impacts)
  (The two alternatives are written out as category hierarchies below.)
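As a purely illustrative aid, the two candidate terminologies can be expressed as simple category hierarchies. The groupings are taken directly from the slide; the Python representation itself is just a sketch.

    # Option A: keep "results" as the umbrella term for observed change.
    OPTION_A = {
        "inputs": [],
        "outputs": [],
        "results": ["intermediate results", "final results"],
    }

    # Option B: use "outcomes" as the umbrella term covering results and impacts.
    OPTION_B = {
        "inputs": [],
        "outputs": [],
        "outcomes": ["results", "impacts"],
    }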

  6. Evaluation
  • Ex ante evaluation: MS
  • Ongoing evaluation: MS, Commission
  • Ex post evaluation: Commission, MS
  • Obligatory evaluation plan?
  • Summary evaluation by MS towards the end of the period (preparing the ex post evaluation by the Commission)?
  • Mid-term evaluation?

  7. Tasks for Evaluation I
  • To support the programming
  • To support strategy development and an optimal split of funding by intervention
  • To check the intervention logic
  • To support target setting for outputs
  • To refine the definition of outcome indicators – no targets
  • To check final target groups
  • To improve the delivery system
  • To check methods + data requirements for future evaluations
  • To spot opportunities for impact evaluations
  When: ex ante evaluations for all programmes

  8. Tasks for Evaluation II
  • To support programme implementation and to refine strategies, according to needs
  • To revisit selected issues looked at by the ex ante evaluation
  • To deal with new situations and new questions
  • By themes, priorities, across programmes…
  When: ongoing evaluations as stated in evaluation plans

  9. Tasks for Evaluation III
  • To capture the effects / impact of interventions:
    • Counterfactual methods
    • Integration of counterfactual and qualitative methods
  • This is new and demanding: no application across the board; the best brains are needed
  When: ongoing and ex post evaluations by MS and Commission
  (A minimal illustration of the counterfactual logic follows below.)
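To illustrate what "counterfactual methods" means in practice, the following minimal Python sketch applies a simple difference-in-differences comparison to made-up numbers. The figures and variable names are hypothetical; real impact evaluations would of course need proper comparison-group design and statistical inference.

    # Illustrative difference-in-differences with synthetic numbers.
    # Outcome: e.g. regional employment rate (%) before and after support.
    treated_before, treated_after = 62.0, 66.5   # regions receiving support
    control_before, control_after = 61.5, 63.0   # comparable unsupported regions

    # The change in the comparison group approximates what would have
    # happened anyway (the counterfactual).
    counterfactual_change = control_after - control_before   # 1.5
    observed_change = treated_after - treated_before         # 4.5

    # Difference-in-differences estimate of the intervention effect.
    effect = observed_change - counterfactual_change         # 3.0 percentage points
    print(f"Estimated effect: {effect:.1f} percentage points")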

  10. Evaluation methods
  • Mix of methods: qualitative and quantitative
  • Reviews, surveys, interviews, case studies, …
  • Impact evaluation (counterfactual approaches)
  • Macroeconomic and sectoral models
  • Project level: cost-benefit analysis, … (a simple worked example follows below)
  • Guidance by the Commission (for REGIO: EVALSED)
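As a small worked example of project-level cost-benefit analysis, the sketch below discounts a stream of hypothetical annual net benefits and compares it with the investment cost. All figures and the discount rate are invented for illustration only.

    # Illustrative cost-benefit calculation with hypothetical figures.
    investment_cost = 10_000_000            # year-0 outlay in EUR
    annual_net_benefits = [1_500_000] * 15  # EUR per year over a 15-year horizon
    discount_rate = 0.05                    # assumed discount rate

    present_value = sum(
        benefit / (1 + discount_rate) ** year
        for year, benefit in enumerate(annual_net_benefits, start=1)
    )
    net_present_value = present_value - investment_cost
    benefit_cost_ratio = present_value / investment_cost

    print(f"NPV: {net_present_value:,.0f} EUR")
    print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")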

  11. Exchange of knowledge
  • Initiatives organised between MS and DG REGIO
  • Network with MS: informal COM-MS-MS exchange on political and practical issues
  • EVALSED: guidance on methods
  • Library of evaluations done in MS:
    • As part of EVALSED
    • Building up a quality review
  • Training, e.g. summer school 2011
  • Biannual conferences
