Monitoring and Evaluation


  2. PART 1 GENERAL ISSUES IN EVALUATION INTRODUCTION • “Evaluation persuades rather than convinces, argues rather than demonstrates, is credible rather than certain, is variably accepted rather than compelling” (House, 1980) • A systematic, analytical assessment of performance • But also a search for explanations of performance • Has a history of unfulfilled promises? And seen as supply driven? Need to demystify? • The concept is defined in multiple and even conflicting ways (OECD, 1997) – reviews, scrutinies, audits, assessments, policy analysis, etc.

  3. INTRODUCTION (Cont’d) • A range of different M & E tools/methods/arrangements are available. Which to use, and when? “Horses for courses” • E.g. results based management, performance budgeting, performance auditing, etc. • Is evaluation a “discipline” or “profession” – or can “anyone” do it? • Timing – ex ante or ex post? Ex post may be too late? • Scope – what is accepted as given? E.g. to what extent should objectives be questioned? Who legitimizes objectives? Who sets the boundaries of the evaluation? • Purpose – judgment or recommendations for improvement; learning (for performance improvement) or accountability?

  4. INTRODUCTION (Cont’d) • Location – internal or external to the object of the evaluation • But even so, where located in the external/internal field? Or a joint, collaborative approach? Not either/or. • To what extent are the major issues institutional or management related (where to locate, incentives, etc.) rather than technical or methodological? • That is, going beyond performance measurement to performance management • M & E is not just a box to be “plugged in” that works automatically • Quality and costs of obtaining information. Need for central information systems?

  5. INTRODUCTION (Cont’d) • The challenge of increased decentralization – who is accountable, and where is the information? Replicating M & E at the sub-national level • Issues with performance indicators – who legitimizes the objectives on which they are based? Developing performance indicators before objectives are clarified. Comprehensiveness versus usability. Playing the numbers game (e.g. waiting lists, crime rates). Possible incentives for dysfunctional management behavior to meet targets. • The rise of performance benchmarking – within and between organizations and governments

  6. MONITORING AND EVALUATING WHAT? • Programs (and policies?) • Organizational units or sub-units • Processes, activities and systems (the “management consulting” field) • Consider in terms of the public sector production function, viz. inputs – processes/systems – outputs – outcomes – impacts

  7. WHAT IS PERFORMANCE? • Impact – on social/economic results/position (MDGs, poverty, etc.) • Effectiveness – (program) outcomes • Efficiency – outputs • Service quality – overlaps with both efficiency and effectiveness? • Good processes and systems • Financial performance – achieving budget targets • Inputs – do we pay enough attention to good cost information? • We need to evaluate all of the above?
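The distinction between efficiency (outputs relative to inputs) and effectiveness (outcomes relative to intended outcomes) can be sketched as simple ratios. This is only an illustrative sketch; the function names and all figures are invented for the example, not taken from the presentation.

```python
def efficiency(outputs: float, inputs: float) -> float:
    """Outputs produced per unit of input (e.g. services per unit of cost)."""
    return outputs / inputs

def effectiveness(actual_outcome: float, intended_outcome: float) -> float:
    """Share of the intended outcome actually achieved."""
    return actual_outcome / intended_outcome

# Hypothetical program: 200 units of input deliver 1000 vaccinations
# (outputs); the intended outcome was 900 children immunized, 810 achieved.
print(efficiency(1000, 200))    # 5.0 vaccinations per unit of input
print(effectiveness(810, 900))  # 0.9, i.e. 90% of the intended outcome
```

A program can score well on one ratio and poorly on the other, which is why the slide treats them as separate dimensions of performance.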

  8. USING MONITORING AND EVALUATION (these are not mutually exclusive) • For budget decision making – improving allocations and seeking savings. But (performance oriented) budgets may not be implemented – lack of realism, inadequate budget execution systems, etc. • For national or sectoral planning – but here too, what about implementation? • Need for a clear link between plans and budgets. Don’t promise in the plan what you can’t deliver in the budget. • For performance management – of programs, activities, organizations • For establishing accountability (internal and external) and social control • All related to decision making to improve public sector performance

  9. USING MONITORING AND EVALUATION • Linking with decision making processes is key – budget, performance contracting, development of national and sectoral plans, etc. • The link with the budget is not mechanistic; M & E informs the budget process. • Which will have the greatest impact on performance (or accountability?), e.g. in what circumstances is performance budgeting the right mechanism? • Many country examples of high quality evaluation and extensive performance indicators with no clear link to decision making processes • Whichever way performance information is used, there is a need for targets and baselines – an often overlooked area
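The point about targets and baselines can be made concrete: a raw indicator value says little on its own, but a baseline and a target let you express it as progress. A minimal sketch, with invented function names and example figures:

```python
def progress_to_target(baseline: float, current: float, target: float) -> float:
    """Fraction of the distance from baseline to target covered so far."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hypothetical literacy indicator: baseline 60%, target 80%, now at 72%.
# 72% alone is uninformative; against the baseline and target it means
# the program is 60% of the way to its goal.
print(progress_to_target(60, 72, 80))  # 0.6
```

Without the baseline, a reported value of 72% cannot be distinguished from no progress at all, which is why the slide calls baselines an often overlooked area.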

  10. PART 2 OECD PERSPECTIVES KEY ISSUES IN PROGRAM EVALUATION (OECD, 1997) • Gaining support from the top – involvement of elected officials, demonstrating it will “count”. Legislative as well as Executive Branch. • Generating effective demand for M & E – who wants it? In everybody’s interest in the long run. Incentives for evaluation – “sticks, carrots and sermons”. • Setting realistic expectations – evaluation will not automatically increase performance and provide new resources. Enthusiasm has been followed by disillusionment.

  11. OECD PERSPECTIVES • Systematizing the evaluation activities – making evaluation a normal part of political and management life • Linking with the budget process – the link is not mechanistic, but the budget process offers potential to improve public sector performance. Performance information informs the budget process (a record of mixed success in performance budgeting in OECD countries). • Can performance budgeting work with budget rigidity/lock-in? • Location – who does or leads it? MOF, Planning Ministry, Presidency, independent organizations (research institutes, a national evaluation organization, management consulting firms), National Audit Institution, line ministries and program managers. Perhaps all of the above!

  12. OECD PERSPECTIVES • Planning evaluations – proper planning is a prerequisite for success • Timing evaluations appropriately – fit in with the policy and decision making cycle to have impact (and time different types of evaluation differently) • Meeting user needs and ensuring relevance – evaluations must be intended to be used • Involving the stakeholders – there are always multiple stakeholders (citizens, elected officials, civil servants)

  13. OECD PERSPECTIVES • Ensuring methodological quality – quality control mechanisms, ethical standards in evaluation • Communicating the results to stakeholders – in a clear and timely way • Monitoring and follow-up – ensuring the results are used • Need for adequate staff and training – evaluation is not for “amateurs” (?). Some specialist skills are required, but mixed teams (evaluators and subject specialists) are useful

  14. OECD COUNTRY ILLUSTRATIONS NEW ZEALAND • Major public sector reforms in the 1980s focused on output budgeting and performance contracting with chief executives • Focus was more on accountability than performance improvement • Outcomes seen as the preserve of elected officials – therefore evaluation withered; loss of performance information • Input side focused on accrual accounting (and budgeting) • Now a move to revitalize program evaluation – key performance issues perceived as overlooked

  15. OECD COUNTRY ILLUSTRATIONS • “Budgeting for outputs but managing for outcomes” • Primarily the responsibility of line ministries and agencies • But mechanisms for linking with higher level government objectives • More ex post than ex ante?

  16. OECD COUNTRY ILLUSTRATIONS AUSTRALIA • A formal (ex post) evaluation system and plan linked with the budget, established in the 1980s, centrally overseen by the MOF but implemented by line ministries/agencies • All programs to be evaluated on a cyclical basis. New policy proposals to provide objectives and performance measures • Evaluation reports published • Evaluation primarily the responsibility of line ministries/agencies – portfolio evaluation plans

  17. OECD COUNTRY ILLUSTRATIONS • But later de-emphasized as the New Zealand approach to output based budgeting was adopted by the new government from 1996. Formal evaluation requirements abandoned; evaluation further devolved to line ministries/agencies • Policy analysis capacity of the MOF weakened considerably – withering of the Cabinet expenditure review process and greater reliance on private sector advice • Outcomes and outputs framework from 1999 – regular collection and reporting of performance information; emphasis on benchmarking within the public sector and comparisons with the private sector (for market testing) • Performance contracting for service delivery. Mainly internal to the ministry and its minister – little MOF involvement

  18. OECD COUNTRY ILLUSTRATIONS • Use of accrual accounting (and budgeting) on the input side • Now moving to outcome performance reporting and budget appropriations for outcomes • National Audit Office review critical of the quality of performance information – insufficient incentives to prepare it seriously because central overview is lacking • Continuing strong performance auditing role of the National Audit Office

  19. OECD COUNTRY ILLUSTRATIONS UNITED KINGDOM • Previously an agnostic view of evaluation – “it might lead to demands for more expenditure” • Traditionally professional resistance to any performance measurement – teachers, police, etc. (“Just give us more resources”) • Previously regular but ad hoc expenditure reviews – fundamental expenditure reviews, etc., by the Treasury • Emphasis on measurable targets for each ministry/agency through Public Service Agreements (PSAs), mostly focused on inputs, processes and service delivery • Concern about too many indicators/targets, and that they were not necessarily the right ones

  20. OECD COUNTRY ILLUSTRATIONS • Now a reduced number of indicators and targets, with more focus on outcomes and outputs • Performance benchmarking is growing • Now regular two-yearly spending reviews using extensive evaluation information, focusing on outcomes and priorities but also examining efficiency and possible savings • Each ministry/agency prepares submissions which are reviewed by the Treasury, Cabinet Office and a Cabinet Committee,

  21. OECD COUNTRY ILLUSTRATIONS • – resulting in a Ministerial agreement and allocation of (indicative) spending for the next three years; that is, medium-term funding provided to achieve medium-term targets • Evaluation/policy issues are embedded in ministries/agencies, with the Treasury playing a review role • Major evaluations in key areas – education, health, the “New Deal for Young People” (a cross-cutting issue) • Influential performance auditing role of the National Audit Office

  22. OECD COUNTRY ILLUSTRATIONS CANADA • In the 1990s, overlapping performance management initiatives – strategic planning, MBO, program reviews, etc. • These appeared to have little input into decision making processes • Now a close linkage with the budget, with emphasis on reallocation – Cabinet Expenditure Review Committee • Extensive annual performance reporting by line ministries/agencies (plus an annual government-wide performance report), but the information appears little used outside the ministries/agencies • Formal evaluation policy issued by the Treasury Board requires a 3–5 year cycle of program evaluation by ministries • Strong performance auditing role of the Auditor-General

  23. OECD COUNTRY ILLUSTRATIONS USA • A long history of published program evaluation through the Government Accountability Office (GAO) • Government Performance and Results Act 1993 (GPRA) requires 5-year strategic plans and development of performance indicators over a medium-term period • Used by the Executive Branch in developing the President’s budget; not used by Congress in appropriating the budget • Program Assessment Rating Tool (PART) – a formal program assessment tool covering efficiency, effectiveness and service quality, used by the Executive Branch on a cycle of evaluations, reinforcing GPRA

  24. OECD COUNTRY ILLUSTRATIONS • Agency Scorecards draw attention to management issues (service quality, systems, contracting out, quality of program performance information etc) SWEDEN • A long established evaluation culture – many evaluative bodies, extensive consultations on policies and programs through Commissions, which bring together all stakeholders and produce public reports to feed into political decisions • Evaluation is an accepted and understood part of political and managerial decision making

  25. OECD COUNTRY ILLUSTRATIONS • Extensive use of performance information in the budget process – annual results information supplemented by regular in-depth reviews • But some question whether the information is more for display than decision making. Budget decision making has focused on efficiency rather than effectiveness issues, with the latter largely handled outside the budget process (?) FRANCE • A noteworthy feature is a government body – the Scientific Council for Evaluation. Evaluation is thus seen as requiring qualified professionals

  26. PART 3 REVIEW OF COUNTRY POSITIONS CURRENT POSITION IN MANY LAC COUNTRIES? • Little information for resource allocation or performance improvement – what are we buying/achieving with this expenditure? • Need to link better with the budget • Centrally driven M & E initiatives and information systems; is there enough emphasis on ministries/agencies developing their own evaluation? • Indicator overload – e.g. volumes of unused performance indicators • Building on national planning systems – the focus therefore is on impact measures • The challenge of evaluation at the sub-national level

  27. REVIEW OF THE FIVE COUNTRY PRESENTATIONS BRAZIL • PPA 4-year plan, enshrined in the Constitution – high level government priorities • Will it become a rolling plan? • Covers all public expenditures – integrating capital and current • Program focus – programs cross organizational boundaries • Seen as a key vehicle for addressing major government priorities – identifies “strategic programs” • Lack of interest by Congress in the PPA and evaluation

  28. REVIEW OF COUNTRY PRESENTATIONS • Link with the Executive budget not clear – even less so with the Congressional budget, although law requires consistency of the budget with the plan • Is the basis for ministerial strategic plans (?) – but weak link with sectoral strategies • Supported by the central SIGPLAN information system • Performance monitoring and evaluation capacity requires further development – few programs have adequate performance indicators • Few comprehensive evaluations of programs so far • But the principle is self evaluation, ex ante for new projects • Perhaps over-engineered – has not had a major impact on resource allocation (?)

  29. REVIEW OF COUNTRY PRESENTATIONS CHILE • Performance information in the budget process through - performance indicators developed by the Budget Office for ministries and agencies, reported to Congress - a formal system of program evaluation, focusing on effectiveness - a budget bidding fund (for new programs) • Thus a variety of (consistent) tools – developing a performance focus at all levels • Government-wide performance measurement and evaluation system (PMES) • Indicators cover economy, efficiency (over half), effectiveness and service quality • But no mechanistic link with budget allocations • Managed by the Budget Office – centralized

  30. REVIEW OF COUNTRY PRESENTATIONS • Evaluations required to be public, independent, reliable, relevant, timely and efficient - program reviews – logical framework, undertaken by a qualified external panel - program (impact) evaluations – impact assessment, more sophisticated methodology - comprehensive expenditure reviews – ministry/agency focus, often seeking savings • All reported to Congress • Strong integration into decision making – may impact the budget: major program redesign, program abolition, confirmation of program effectiveness, change of program management • Extensive training of the evaluation community

  31. REVIEW OF COUNTRY PRESENTATIONS COLOMBIA • SINERGIA in operation since 1994 • To measure and track public sector performance – improve resource allocation and formulation of the National Development Plan; provide information for debate on public policies • Overseen by the National Council of Economic and Social Policy • Three components – results monitoring, strategic evaluations and reporting (for accountability/social control) • More work required to link evaluation results with the budget – dual budgeting (capital and current separate) complicates this • Planning and budgeting also not adequately linked • Indicator overload – simplification undertaken so elected officials can see the relation between policies/priorities and expenditures • Reclassification of expenditures also needed • Not yet applied at the sub-national level, but replication planned for municipal government • Lack of incentives for evaluation

  32. REVIEW OF COUNTRY PRESENTATIONS MEXICO • Performance based budgeting initiative • Extensive system of performance indicators – but in need of improvement • What use is made of the indicators? Centralized in the MOF with little line ministry involvement? • Performance indicators audited and reported to Congress (?) • 1999 Congressional mandate for program evaluations (?) • Evaluation in social programs – Ministry of Social Development (SEDESOL) M & E system – National Council for the Evaluation of Social Development Policy

  33. REVIEW OF COUNTRY PRESENTATIONS PERU • The MEF has commenced a national M & E system (SSEGP) as part of a wider effort to promote results based management • Reflects the need to improve the quality of public expenditures – little information on the results of expenditures • Budget linkage not yet well developed • Will include performance indicators, evaluations and audits • Performance indicators developed at different levels of national government – sector, organizational unit, activity, etc. (need to reduce the number of indicators) • Impact indicators not owned by any institution; therefore how to get action? • Indicators not yet developed for sub-national government, where much service delivery takes place • Need for baseline information recognized • Budget rigidity makes expenditure reallocations difficult