
V4 + Croatia & Slovenia Expert level conference


Presentation Transcript


  1. V4 + Croatia & Slovenia Expert level conference The Closing event of the Hungarian V4 presidency

  2. The process of setting up the performance framework in Hungary. Szilvia HAJDU, Head of Evaluation and Analysis Unit, Prime Minister's Office

  3. Experiences 2007-2013 • Too many indicators (~6,000 in the IT system) • Aggregation of data was not always possible • All indicators, even impact indicators, were collected from beneficiaries • A "performance framework" was already introduced at the level of beneficiaries (consequence: retrospective amendments of contracts due to non-achievement of targets → administrative burden)

  4. New requirements for 2014-2020 • Obligatory use of common indicators • Financial consequences linked to physical indicators • Implications: • The previous indicator routine won't work in 2014-2020 → need for enhanced central coordination • Common indicators are not applicable for proper monitoring of progress → need for external result indicators • Consequences for the indicator programming: • No. 1: Re-conception of the whole indicator system • No. 2: A very simple indicator system in the OPs

  5. Programming indicators was more time-consuming than expected… • Very slow acceptance of the new intervention logic → "impact indicators" are still favourites despite their non-measurability • Ministries wanted to introduce a lot of indicators in the OPs (remember DG MOVE, DG ENVI, the EU Parliament etc. in the case of common indicators) • Several workshops held by the Central M&E Department with MAs and other responsible bodies between summer 2013 and spring 2014 • February 2014: common workshop with COM on indicators • Conceptual questions had to be discussed → a time-consuming process

  6. Programming of indicators: consultations [slide diagram of the bodies consulted] • COM desk officers on intervention logic & programme-specific indicators • COM horizontal units on common indicators & performance framework • Ministries (OP designers) • National Ministry of Economics (OP programming coordination) • Managing authorities (Ministries) • Central M&E Department (Prime Minister's Office)

  7. Programming principles • Principles of indicator programming as agreed: • Output indicators to be chosen from common indicators • Performance framework indicators to be chosen from output (and therefore common) indicators • Performance framework covered by only one output indicator (if possible) • No use of result indicators in the performance framework • Result indicators to be chosen only from already available databases
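The agreed principles amount to a set of mechanical checks on each OP's indicator list. Below is a minimal sketch in Python; the indicator codes, database names, and function shape are purely illustrative assumptions, not the actual Hungarian system or the fund regulations' code lists.

```python
# Hypothetical validation of the agreed programming principles.
# All identifiers below are invented for illustration.

COMMON_INDICATORS = {"CO01", "CO02", "CO34"}        # stand-ins for common output indicators
AVAILABLE_DATABASES = {"KSH_labour_survey", "NAV_tax_register"}  # existing national sources

def validate_op_indicators(outputs, pf_indicators, results):
    """Check one OP's indicator set against the agreed principles."""
    issues = []
    # Principle 1: output indicators must be chosen from the common indicators
    for ind in outputs:
        if ind not in COMMON_INDICATORS:
            issues.append(f"output indicator {ind} is not a common indicator")
    # Principle 2: performance framework indicators must be output (hence common) indicators
    for ind in pf_indicators:
        if ind not in outputs:
            issues.append(f"PF indicator {ind} is not among the OP's output indicators")
    # Principle 3 is soft ("if possible"): flag, rather than reject, multiple PF indicators
    if len(pf_indicators) > 1:
        issues.append("warning: performance framework uses more than one output indicator")
    # Principle 5: result indicators must come from already available databases
    for ind, source in results.items():
        if source not in AVAILABLE_DATABASES:
            issues.append(f"result indicator {ind} relies on an unavailable source {source}")
    return issues

print(validate_op_indicators(
    outputs={"CO01", "CO34"},
    pf_indicators={"CO01"},
    results={"SME_value_added": "NAV_tax_register"},
))
```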

  8. Technical challenges • ERDF • Common indicators are not always output indicators (e.g. CO2 emissions) • Additional programme-specific indicators had to be defined in order to put them into the performance framework [administrative burden] • ESF • Target setting for output indicators became very complicated (by age, by education, by employment status etc.) → overlapping values • Some common indicators had to be merged into a new programme-specific indicator in order to put the latter into the performance framework [approval by COM pending]
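The "overlapping values" problem for ESF target setting is easy to see in miniature: every participant falls into one category per dimension (age, education, employment status), so the subgroup breakdowns each re-partition the same total rather than adding up across dimensions. A toy illustration, with invented category labels:

```python
# Why ESF subgroup targets overlap: each dimension partitions the same
# participants, so every dimension must sum to the same overall total.
from collections import Counter

participants = [
    {"age": "under_25", "education": "ISCED_1_2", "status": "unemployed"},
    {"age": "25_54",    "education": "ISCED_3_4", "status": "inactive"},
    {"age": "25_54",    "education": "ISCED_5_8", "status": "unemployed"},
]

for dim in ("age", "education", "status"):
    counts = Counter(p[dim] for p in participants)
    # categories within a dimension sum to the total; dimensions cannot be added together
    assert sum(counts.values()) == len(participants)
    print(dim, dict(counts))
```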

  9. Technical challenges • Should only one indicator be chosen, covering 50% or more of the allocation? • For some priority axes responsibilities are shared among several ministries (e.g. the Environment and Energy OP) • Some priority axes are very heterogeneous (e.g. the Regional OP) • Choosing only one indicator may "punish" other thematic fields in the case of non-achievement • However, choosing more indicators may be complicated to monitor, and the risks of non-achievement are higher
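The trade-off turns on how much of a priority axis's allocation the chosen performance framework indicator represents. A hedged sketch of the coverage arithmetic, where the intervention fields and allocation figures are invented for illustration:

```python
# Share of a priority axis's allocation covered by the intervention types
# that the chosen PF indicator(s) measure; figures are illustrative only.
axis_allocation_eur = {
    "waste_management":  120_000_000,
    "water_treatment":   200_000_000,
    "energy_efficiency":  80_000_000,
}

def coverage_share(covered_fields):
    total = sum(axis_allocation_eur.values())
    covered = sum(axis_allocation_eur[f] for f in covered_fields)
    return covered / total

share = coverage_share({"water_treatment", "energy_efficiency"})
print(f"covered share: {share:.0%}")  # 70% -> the 50% threshold is met
assert share >= 0.5, "PF indicators cover less than 50% of the allocation"
```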

  10. Technical challenges: target setting • A general, standardized methodology couldn't be applied because of differences in the programming exercise • However, the main target-setting methodology was based on historical data from the 2007-2013 period • In the case of the Transport OP, the targets were taken over from the National Transport Strategy, which covers all transport projects to be implemented in the 2014-2020 period • In the case of infrastructural developments, the amounts and the corresponding indicator values of phased projects could not be taken into consideration yet
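For the historical-data approach, the core arithmetic is a unit-cost extrapolation from the previous period. A minimal sketch, with entirely invented figures:

```python
# Derive a 2014-2020 output target from 2007-2013 historical unit costs.
historical_spend_eur = 150_000_000   # 2007-2013 spend on a comparable measure
historical_output    = 500           # e.g. km of road reconstructed
allocation_2014_2020 = 180_000_000   # planned allocation for the new period

unit_cost = historical_spend_eur / historical_output   # EUR per unit of output
target = allocation_2014_2020 / unit_cost
print(f"unit cost: {unit_cost:,.0f} EUR/unit -> target: {target:.0f} units")
```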

  11. Technical challenges: SFC • There is a requirement to split the target values of ESF indicators by men and women • Application of the performance reserve by year → does it have any implications for the commitment targets, or is it just a technical exercise?
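The gender-split requirement is, at bottom, an additivity constraint on the entered values. A sketch of the check, assuming hypothetical field names rather than the actual SFC2014 schema:

```python
# ESF target values entered split by men and women must add up to the total.
# Indicator names and figures are illustrative assumptions.
esf_targets = {
    "CO01_unemployed_participants": {"total": 10_000, "men": 5_500, "women": 4_500},
}

for indicator, t in esf_targets.items():
    assert t["men"] + t["women"] == t["total"], f"{indicator}: split does not sum to total"
    print(indicator, "split OK")
```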

  12. Summary for ERDF • The system introduced in the context of the performance framework is based on the common indicators (to ensure coherence and a limited number of indicators) • However, common indicators are not always applicable for measuring real progress (e.g. in the case of SME support the common ERDF indicator – number of SMEs receiving support – is not even an indicator…) • A sourcebook of ERDF indicators has to be defined – these indicators won't be included in the OPs, but they are crucial for monitoring the projects' progress • Enhanced role of evaluations

  13. Summary for ESF • The system introduced in the context of the performance framework is based on the common indicators (to ensure coherence and a limited number of indicators) • The ESF common indicators are adequate (because the interventions are homogeneous) • A proper data collection system has to be defined (1st step: drafting a survey to be collected from participants from the very beginning and throughout the whole programming period) • However, ESF common indicators say nothing about efficiency, i.e. which type of intervention works better in different target groups → enhanced role of evaluations

  14. Next steps as defined in the Partnership Agreement • Complementary documentation on the performance framework hasn't been submitted yet → a consistency check is in progress: • Whether similar indicators have similar unit costs – especially in the case of ESF • Whether the sum of the milestones for the financial indicators covers exactly the N+3 commitment targets for 2018 (by Fund and category of regions) • Whether the 50% representativity rule is satisfied • Monitoring of phased projects and review of the performance frameworks by the corresponding indicator values
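The milestone/N+3 check in the second bullet is a straightforward reconciliation per Fund and category of region. A sketch with invented figures; the key structure and amounts are assumptions for illustration:

```python
# Do the 2018 financial-indicator milestones add up to the N+3 commitment
# target for 2018, per Fund and category of region? Figures are invented.
milestones_2018 = {                     # (fund, region category) -> sum of OP milestones, EUR
    ("ERDF", "less_developed"): 1_250_000_000,
    ("ESF",  "less_developed"):   420_000_000,
}
n_plus_3_targets_2018 = {               # cumulative commitments due under N+3 by end-2018
    ("ERDF", "less_developed"): 1_250_000_000,
    ("ESF",  "less_developed"):   430_000_000,
}

for key, target in n_plus_3_targets_2018.items():
    diff = milestones_2018.get(key, 0) - target
    status = "OK" if diff == 0 else f"off by {diff:+,} EUR"
    print(key, status)
```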

  15. Next steps for proper monitoring • Consistency check with the EAFRD and EMFF • Setting up an indicator working group at the level of the Partnership Agreement • Defining the national sourcebook of indicators for ERDF (not to be included in the OPs) • Defining the data collection system for ESF (in close cooperation with evaluations) • Quarterly monitoring of indicators (an early warning system for all OPs) by the central monitoring unit
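A quarterly early-warning check of the kind mentioned in the last bullet could be as simple as comparing achieved values against a trajectory toward the 2018 milestone. A sketch where the linear trajectory and the 75% threshold are illustrative choices, not the unit's actual methodology:

```python
# Flag OPs whose indicator achievement falls behind a linear trajectory
# toward the 2018 milestone; assumptions are illustrative only.
from datetime import date

def expected_share(today, start=date(2014, 1, 1), milestone=date(2018, 12, 31)):
    """Share of the milestone value expected by 'today' under linear progress."""
    return (today - start).days / (milestone - start).days

def early_warning(actual, target, today, tolerance=0.75):
    """True if achievement is below 75% of the linear trajectory."""
    expected = expected_share(today) * target
    return actual < tolerance * expected

print(early_warning(actual=120, target=1_000, today=date(2016, 6, 30)))  # True -> warn
```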

  16. Thank you for your attention • Szilvia HAJDU, Head of Evaluation and Analysis Unit, Monitoring and Evaluation Department, Prime Minister's Office • Szilvia.Hajdu@me.gov.hu • SAVE THE DATE for the V. Evaluation and Monitoring Conference, 16-17 October 2014, Budapest, Hungary
