Monitoring of Cooperation Strategies Workshop Latin America Division, Bogota October, 2012


  1. Federal Department of Foreign Affairs FDFA – Swiss Agency for Development and Cooperation SDC, Quality Assurance. Monitoring of Cooperation Strategies. Workshop Latin America Division, Bogota, October 2012

  2. Content of the presentation • Introduction • SDC monitoring concept • Some issues • Link Monitoring Cooperation strategy – Projects • Fragile contexts • Examples (emerging new practice) • Concluding remarks

  3. Initial remarks – the triangle – key aspects INTRODUCTION

  4. Initial remarks • Team effort • Let's use our PCM knowledge, our country expertise and thematic skills… • Management on board! • Clarify the purpose of monitoring, its scope, resources, use… • Important: a high level of ambition regarding quality • More important: starting and maintaining a monitoring practice, rather than setting the level of ambition too high • Not a new activity: learning from our own practice • Organize exchange of experience (e.g. within a region)

  5. Linkages between CS, CS Monitoring and Annual Reports – Results-oriented Management of CS, Focus on Outcomes • CS: interventions/modalities defined; Results Framework established; CS Monitoring Concept/Plan • Monitoring of CS: outcome and output information per domain, at country and Swiss portfolio level; portfolio management, context information • Annual Reporting: results achieved, reported with reference to CS and RF; steering • Series of annual results reporting + evaluation(s) = input for new CS

  6. Key aspects of a Monitoring system

  7. Key aspects (cont.)

  8. Some elements taken from the … SDC MONITORING CONCEPT

  9. CS Monitoring: Levels of Observation • I. Portfolio management by SCO • II. Swiss portfolio outputs & outcomes • III. Country-level outputs & outcomes (incl. specific context elements relevant to the Swiss portfolio) • Contribution • Wider Country Context (MERV, scenarios) • Harmonisation and Alignment

  10. Portfolio Management: key elements • Portfolio management refers to elements that are relevant for the quality/plausibility of results achievement, like... • ...Portfolio composition, mix of operations/projects • ...Managing relationships with partners • Development/aid effectiveness agenda • Policy dialogue • ...Learning and knowledge management • Knowledge and skills for managing the portfolio (including innovative elements); HR management • Regional learning and exchange • ...Approaches, strategies, aid modalities per Domain • ...Financial management: allocations per Domain

  11. Key questions of "country level" • Is the partner country moving towards the set objectives relevant for a CS component (in particular country level outcomes, field 3 of CS RF, right column AR)? • Do these objectives keep their validity or do they need to be revised (from the donor's point of view)?

  12. Key questions of the "Swiss portfolio level" • Is the Swiss portfolio evolving in line with the set objectives in the CS (RF fields 1 and 2; left column AR)? • Do observed portfolio outputs and outcomes relate/contribute to country outputs and outcomes (main findings: results statement AR)? • Do the expected portfolio contributions to country-level outputs and outcomes keep their validity or do they need to be revised (fields 1 and 2 of CS RF)?

  13. Key questions of the "portfolio management level" • Does the portfolio management support results achievement? • To what extent does portfolio management relate to national processes? • Is the portfolio management sensitive to risks, opportunities and context/scenario development? • Specific component of the monitoring system; based on principles (or goals) in ch. 6 of the CS (and/or RF, field 4) • E.g. sector programme support • Capacity development goal (field office staff, partners)

  14. Yearly outcomes? – aggregation – plausibility in results statements SOME ISSUES

  15. Yearly Outcomes?!

  16. Aggregation • Aggregation in a rigorous sense is not a standard • Not equally achievable in all sectors and/or countries • Indicators not always "scalable" • Mix of instruments; joint programmes • Contribution to country results • However, key indicators in results frameworks and project monitoring make sense, e.g. Universal Impact Indicators (DCED) • Inputs of thematic networks and SECO • Instead of aggregation: meaningful synthesis/integration of relevant information (not excluding aggregation where feasible)

  17. Plausibility vs. strict evidence • Strict evidence: comprehensive baseline; sophisticated logframe – qualitative; business plan assessment; aggregation and attribution • SDC MCS concept: results framework; elements of baseline/target values; contribution; assessment based on plausibility (country, portfolio, portfolio mgt.) > RESULT STATEMENT

  18. Result statement (Annual Report) • RS = product of an assessment process, based on outcomes, outputs (Swiss portfolio, country development) and context information of a Domain of intervention • Key element of our results reporting • Time series of RS (see appendix of AR) • Source for higher-level synthesis of results • Features of a RS (see guidance AR) • Qualitative and quantitative synthesis of results • Enriched by elements of process/context • Outstanding aspects of transversal themes • Highlighting contribution to country development

  19. Managing indicators – developing capacities – using information LINKS CS - PROJECTS

  20. Linkages between CS (Swiss portfolio) and Project Level • Strategic level: CS (Strategy Domains), Results Framework, Monitoring CS, Annual Report relating to CS • Operational level: Project Planning, Logframe, Project Monitoring, Annual Progress Reports by Partners • Links: 1) Indicators 2) Guidance, Support 3) Sources of Information

  21. Links Project – CS monitoring • (1) Define types of indicators on CS and project level, and links between those indicators • Results framework: effectiveness/impact and outreach indicators; population/organization level; distribution (Swiss portfolio; country results) • Project logframes: in addition, indicators covering quality, sustainability, … • (2) Portfolio analysis: performance/capacities regarding monitoring in projects, SCO support/peer exchange • Use data from and invest more in "feeder projects": (1) strong relation to strategic outcomes of the DoI, (2) financial volume, (3) stakeholders involved, (4) potential for scaling up • Use certain fields of observation for the whole portfolio, e.g. gender, inequalities, …

  22. Links Project – CS monitoring: (3) Sources • Progress reports by partners • Substance of information more important than harmonizing reporting periods • Field visits by SCO staff: address results issues • End-of-phase reports: plan them (over the CS period), support them, use them • Reviews/Evaluations: project → domain? • Complement with case studies, beneficiary assessments • Get involved as SCO team member • Get others involved (government, donors, ....) • National sources for country results/indicators (no generation of data)

  23. Availability of data (national sources) • Assess availability/quality of country data • Use country data to the extent possible • Might include joint support to national systems • Clarify in the Annual Report when data are missing; qualify country data • Identify alternative sources • Shadow reports by NGOs, think tanks • Multilateral sources • Resource persons, joint assessments

  24. Types of results – specific elements of monitoring (preliminary thoughts) FRAGILE CONTEXTS

  25. Fragile contexts (Monitoring Concept, Implementation Plan Fragility) • Hypothesis of change → to be discussed in AR, MTR • Strengthened context monitoring/assessment (frequency of MERV); SDC + other offices • Scenarios → to be reflected in the AR exercise • Type of results, portfolio mix → AR, MTR • Transparent rating in AR (regarding results achievement) • Indicators on implementation modalities (e.g. dialogue with multilaterals; CSPM in projects; diversity/staff composition on all levels; …)

  26. Country level: Context-specific MCS (from increasing fragility to increasing stability) • Fragile states (public sector close to collapse, high probability of violence): country-level reference frame for the SDC portfolio mainly set by the international community or proxies; SDC portfolio: process focus (peace and state building), capacity building focus, facilitation focus • Partner countries with critically low public sector management capacities, weak or no PRS and sector policies, little or no ownership for the Paris/Accra agendas: reference frame: proxies for outcomes of sectors or themes SDC partners are working in; SDC portfolio: capacity building focus, process focus, facilitation focus • Partner countries with low to middle public sector management capacities, well-developed PRS and sector policies, and ownership for the Paris/Accra agendas: reference frame: MDGs, PRS or social inclusion strategies, sector policies; SDC portfolio: facilitation focus, capacity building focus

  27. Country level: Context specific MCS (2) Type of results (outcome level) on population level Type of results on organizational level

  28. Emerging good practice at SDC EXAMPLES

  29. Examples of monitoring systems • Summary notes Afghanistan, Moldova (available) • Structured according to the key aspects • OTP, Ukraine (forthcoming, t.b.c.) • SCOs encouraged to elaborate a summary note of their monitoring system • Supports definition of the monitoring system • Focal person? • To be shared with new staff/stakeholders/partners (country) • Useful for sharing with other field offices • E.g. Moldova: monitoring matrix

  30. Thanks a lot for your attention, here some CONCLUDING REMARKS

  31. Concluding remarks • A monitoring system ≠ a data collection system… rather, it should produce relevant information on the implementation of the CS • A monitoring system requires… • definition of objectives (monitoring functions) • a short concept/summary note shared with stakeholders • a set of methods of data collection • a link to existing processes (annual and per period), TRR • team events: use of information and decision making • resources (Swiss programme, country?) • Use the first AR exercise with the new RF to adjust it • Improve over time (= clarify, create ownership, simplify)
