
PROJECT CYCLE MANAGEMENT (PCM)



Presentation Transcript


  1. PROJECT CYCLE MANAGEMENT (PCM) Outline of Presentation • Steps in PCM • Logical Framework • Monitoring and Evaluation Fundamentals • Results-Based Management

  2. References • Project Cycle Management Manual, European Commission, 2005. • M&E Fundamentals, Global Health eLearning, USAID. • Role of Evaluation in USAID, 1997.

  3. STEPS IN PCM

  4. What is PCM? • PCM is the process by which projects are designed, implemented and evaluated. • Using PCM can help ensure that projects are: • relevant • well designed • transparently managed • and lead to real and lasting benefits.

  5. PCM

  6. Steps of the PCM

  7. 1. Stakeholder Analysis A stakeholder is any individual, group, organization or community with an interest in the outcome of a programme/project. Key Questions: Whose problems or opportunities are we analyzing? Who will benefit or lose out, and how, from a potential project intervention?

  8. Problem Analysis • Problem Analysis visually represents the causes and effects of existing problems in the project area, in the form of a Problem Tree. It clarifies the relationships among the identified problems. • Step 1: Openly brainstorm problems which stakeholders consider to be a priority. • Step 2: Begin to establish a hierarchy of causes and effects.

  9. Relationships of Problems? (Problem Analysis example) Simply connect causes and effects with "because". EFFECT: I am not motivated to work. CAUSES: My salary is low (because my company is not making a profit; because my post is not high yet). I get poor respect from the boss (because of my poor communication with the boss; because we think in different ways). I am not sure what to do in this work (because I lack expertise/skills; because I lack training).
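The problem tree on this slide can be sketched as a simple data structure. This is an illustrative sketch only: the node names come from the slide's example, but the dictionary representation and the `root_causes` helper are assumptions, not part of any PCM standard.

```python
# Hypothetical sketch: the slide's problem tree as an effect -> causes mapping.
# Keys are effects; values are their direct causes ("because ...").
problem_tree = {
    "I am not motivated to work": [          # core problem (top effect)
        "My salary is low",
        "I get poor respect from the boss",
        "I am not sure what to do in this work",
    ],
    "My salary is low": [
        "My company is not making a profit",
        "My post is not high yet",
    ],
    "I get poor respect from the boss": ["My communication with the boss is poor"],
    "My communication with the boss is poor": ["We think in different ways"],
    "I am not sure what to do in this work": ["I lack expertise/skills"],
    "I lack expertise/skills": ["I lack training"],
}

def root_causes(tree, effect):
    """Walk down the tree and collect the causes that have no further causes."""
    causes = tree.get(effect, [])
    if not causes:
        return [effect]
    roots = []
    for cause in causes:
        roots.extend(root_causes(tree, cause))
    return roots
```

Walking the tree from the core problem surfaces the leaf-level causes (for example, "I lack training"), which are the natural candidates for project interventions.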

  10. 4. Objective Analysis Objective Analysis clarifies the means-ends relationship between the desirable situation that would be attained and the solutions for attaining it. This stage also requires an Objective Tree.

  11. Reformulating Problems into Objectives (Objective Analysis example) Problem → Objective: I am not motivated to work → I enjoy working and am highly motivated (Results). Not sure what to do in this work → Know what to do well. Salary is low → Salary is increased. Poor respect from the boss → Respected and encouraged by boss. Lack of expertise/skills → Adequate skills/expertise. My company is not making a profit → My company is making profits. Not a high post yet → Get promoted. Poor communication with boss → Better communication with boss. Different ways of thinking → Understanding of his/her perspective. Lack of training → Get trained (Means).

  12. From Problems to Objectives Analysis Problems: levels of solid waste increase; households and factories pollute water; treatment plants are not up to standards; ecosystem deteriorating; prevalence of diseases; reduced fish stock. Objectives: solid waste reduced; households and factories reduce pollution; standards upgraded; ecosystem protected; prevalence of diseases reduced; fish stock for eating maintained or increased.

  13. 5. Project Selection Project Selection is a process in which specific project strategies are selected from among the objectives and means raised in Objective Analysis, based upon selection criteria. How to select the project: • Divide the objective tree into different clusters of objectives • Name all clusters • Remove clusters that are impossible to achieve • Set criteria to make the final selection: needs, priority, gender, environment, difficulty, budget, risks, etc.

  14. Logical Framework A management tool used to improve the design of interventions, most often at the project level. • It involves identifying strategic elements (inputs, outputs, outcomes, impact) and their causal relationships, indicators, and the assumptions or risks that may influence success and failure. • It thus facilitates planning, execution and evaluation of a development intervention.

  15. A logframe can look something like this… Columns: Objectives & activities; Indicators; Means of verification; Assumptions. Rows: Goal; Purpose; Outputs; Activities (with Means and Cost in place of indicators and means of verification). Also known as the 'logical framework matrix'. Started in the 1960s by USAID; now used by most major donors. Gives a clear, simple and concise summary of what the project will achieve.
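The standard 4x4 matrix described above can be sketched as nested dictionaries. This is a minimal illustration of the matrix layout only; the field values and the `Means`/`Cost` placement follow the slide, while the data structure itself is an assumption.

```python
# Illustrative sketch: the logical framework matrix as nested dicts.
COLUMNS = ["description", "indicators", "means_of_verification", "assumptions"]
ROWS = ["goal", "purpose", "outputs", "activities"]

# An empty logframe: one cell per row/column intersection, to be filled in
# during project design.
logframe = {row: {col: None for col in COLUMNS} for row in ROWS}

# As on the slide, the Activities row conventionally carries Means and Cost
# in the indicator and means-of-verification columns.
logframe["activities"]["indicators"] = "Means"
logframe["activities"]["means_of_verification"] = "Cost"
```

Representing the matrix this way makes the completeness check trivial: a designed logframe is one with no remaining `None` cells.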

  16. Organisations Using the Logframe Developed by USAID in the 1960s, later taken up by various UN agencies and adopted by GTZ in the 1980s. • USAID, USA • GTZ, Germany • DfID, Great Britain • NORAD, Norway • DANIDA, Denmark • AUSAID, Australia • Intercooperation, Switzerland • Ministry of Foreign Affairs, France • DGCD, Belgium • European Commission • DGCS - Min. of For. Aff., Italy • ICAX - Min. of Industry, Spain • SIDA, Sweden • UNIDO, Vienna • FINNIDA - Min. of For. Aff., Finland • HELLASCO, Greece • WWF • Int. Federation of Red Cross • UNDP • FAO

  17. Purposes of a Logical Framework • A systematic tool for designing, planning, implementing, and monitoring and evaluating a project (or program). • A tool for organizing thinking: for relating inputs to the implementation of activities, activities to the production of outputs, outputs to the achievement of a defined purpose, and purpose to a high-level goal or impact. • A tool for identifying and assessing risks, by listing critical assumptions inherent in project design and implementation. • A tool for measuring project progress, through objectively verifiable indicators and means of verification. • A tool for developing consensus and communicating a project's intent and strategy.

  18. Advantages of the logical framework • Problems are analysed systematically • The objectives are clear, logical and measurable • The risks and conditions for success of a project are taken into account • There is an objective basis for monitoring and evaluation • Your project proposal will be coherent

  19. LogFrame Variations Don't over-focus on the terminology and the variations among LogFrame matrix models. The important lesson is to learn to think through projects using a logic model.

  20. The Logic of LogFrames If adequate RESOURCES/INPUTS are provided, then the ACTIVITIES can be conducted. If the ACTIVITIES are conducted, then DELIVERABLES can be produced. If DELIVERABLES are produced, then the OBJECTIVES are accomplished. If the OBJECTIVES are accomplished, then this should contribute to the overall GOAL.
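The if-then chain above can be expressed as a small sketch: each level of the hierarchy is only plausibly achievable once every level below it has been achieved. The level names follow the slide; the function itself is a hypothetical illustration, not a standard PCM tool.

```python
# The logframe hierarchy, bottom (preconditions) to top (goal), as on the slide.
CHAIN = ["resources/inputs", "activities", "deliverables", "objectives", "goal"]

def achievable(achieved_levels):
    """Return the levels that are plausibly achievable given what has
    already been achieved: a level qualifies only if every level below
    it is in `achieved_levels`."""
    ok = []
    for i, level in enumerate(CHAIN):
        if all(lower in achieved_levels for lower in CHAIN[:i]):
            ok.append(level)
    return ok
```

For example, with only resources/inputs in place, activities become achievable, but deliverables, objectives and the goal do not; this mirrors the slide's point that each "then" depends on its "if".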

  21. Writing Description Statements • Goal: The broad development impact to which the project contributes, at a national or sector level. Wording: "To contribute to…" • Purpose: The development outcome at the end of the project; more specifically, the expected benefits to the target group(s). Wording: "Increased, improved, etc." • Outputs: The direct/tangible results (goods and services) that the project delivers, and which are largely under project management control. Wording: "Delivered/produced/conducted, etc." • Activities: The tasks (work program) that need to be carried out to deliver the planned results. Wording: "Prepare, design, construct, research, etc."

  22. Example of Logframe

  23. A LogFrame Matrix Example (partial build-out) Goal; Objectives/Outcomes; Deliverables/Outputs; Activities

  24. 6. Means of Verification (MV) Tools or means to obtain the information required by the indicators. These include: • project documents • field verification • ad-hoc studies

  25. Important Assumptions • A necessary condition for the achievement of results at different levels. Assumptions are external factors that have the potential to influence (or even determine) the success of a project, but lie outside the direct control of project managers. Key points in setting assumptions: • They should be relevant and probable. • If an assumption is not important, or is almost certain to hold, do not include it. • If an assumption is unlikely to occur, it is a killer assumption: abandon the project.

  26. Risk: A Definition • A potential event or occurrence beyond the control of the programme that could adversely affect the achievement of the desired results • A threat to success • Not just the negative of an assumption • A trigger for reconsideration of strategic direction

  27. 7. Work Plan • The work plan is prepared by the project implementers, based on the PDM (Project Design Matrix) and other information. • It is an effective tool for project implementation and management, and provides important data for monitoring and evaluation of the project.

  28. 7. Work Plan Format Example: Format of Work Plan

  29. Monitoring and Evaluation Fundamentals

  30. Definitions • Monitoring is the routine reporting of data on program implementation and performance. • It includes routine data collection and reporting on a monthly, quarterly, or yearly basis. • Key question: Has the program been implemented according to the plan? • Monitoring involves counting what we are doing, and routinely looking at the quality of our services. • Evaluation is the periodic assessment of a program's impact at the population level, and of its value. • Evaluation is the use of social research methods to systematically investigate a program's effectiveness. • It requires a special study design, sometimes a control or comparison group, and measurements over time. • Evaluation differs crucially from monitoring in that the goal of evaluation is to determine how much of the change in outcomes is due to the program or intervention.

  31. 12 Components of M&E System

  32. Purpose of M&E • Program improvement ("working smarter, not harder") • Reporting/accountability • Sharing data with partners

  33. Purpose of M&E • Measure program effectiveness. • Help managers and implementers acquire the information and understanding they need to make informed decisions about program operations. • Demonstrate to planners and other decision-makers that program efforts have truly had a measurable impact on the outcomes of interest. • Help you make the most effective and efficient use of resources. • Help you determine exactly where your program is on track and where you need to consider making corrections. • Help you come to objective conclusions regarding the extent to which your program can be judged a "success". • Provide the data necessary to guide strategic planning, to design and implement programs and projects, and to allocate and re-allocate resources in better ways.

  34. Monitoring and Evaluation? Along the results chain: INPUTS (input monitoring) → PROCESS (process monitoring) → OUTPUTS (output monitoring) → OUTCOMES (outcome monitoring and/or evaluation) → IMPACTS (impact monitoring and/or evaluation). Efficiency relates INPUTS to OUTPUTS; Effectiveness relates results to OBJECTIVES.
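The results chain and the M&E activity attached to each level can be sketched as a lookup table. The level-to-activity mapping is taken from the slide's diagram; the table form is an illustrative assumption.

```python
# Which M&E activity applies at each level of the results chain,
# following the slide's diagram.
ME_BY_LEVEL = {
    "inputs":   "input monitoring",
    "process":  "process monitoring",
    "outputs":  "output monitoring",
    "outcomes": "outcome monitoring and/or evaluation",
    "impacts":  "impact monitoring and/or evaluation",
}

def me_activity(level):
    """Look up the M&E activity for a results-chain level (case-insensitive)."""
    return ME_BY_LEVEL[level.lower()]
```

The table makes the slide's dividing line visible: the lower levels (inputs through outputs) call only for monitoring, while outcomes and impacts may also require evaluation with a special study design.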

  35. Links between the Logframe Approach and evaluation criteria

  36. The linkages • ACTIVITIES → OUTPUTS: service provision and delivery, participants trained, media productions, community outreach, beneficiaries reached, policies developed, income statement, IEC distributed, statements shared, campaigns organized, coverage. • OBJECTIVES → OUTCOMES: population/audiences (knowledge, attitudes, behaviors, practices); decision makers (policy reformed, norms changed, structures changed); improved quality of social, legal and health services delivered. • GOAL → IMPACT: population/audiences (reduced mortality, reduced morbidity, quality of life, adequate housing, rights/human rights, income increased, GDP, poverty reduction, gender equity). Outcome and impact data typically require a special survey.

  37. What are the different type of indicators?

  38. The linkages RESOURCES/INPUTS → ACTIVITIES/OUTPUTS → OBJECTIVES/OUTCOMES → GOAL/IMPACT

  39. Evaluation Purposes • Explain unexpected results (positive or negative). • Determine if customer needs are being met . • Assess net impacts of project/program activities. • Identify unintended impacts. • Explore special issues such as sustainability, cost effectiveness, relevance. • Make action recommendations for program improvement. • Distill lessons for application in other settings. • Test validity of hypotheses and assumptions underlying results frameworks.

  40. Types of Evaluation (Responsibility) USAID evaluations can be categorized into several types based on who is conducting them: • Internal or self-evaluations are conducted by the operating unit or agency implementing the activity or program being assessed. • External evaluations are conducted by an independent office or experts not directly associated with the activity or program. • Collaborative evaluations are conducted jointly by more than one office, agency, or partner. • Participatory evaluations are conducted by multiple stakeholders, often in a workshop format with the help of a facilitator. Stakeholders include representatives of customers or beneficiaries, as well as sponsoring donor agencies, implementing agency staff, and others with a stake in the program. The stakeholders have active participation in all phases of the evaluation, including planning, data collection, analysis, reporting, dissemination and follow-up actions.

  41. Types of Evaluation (Timing) • Ex-ante evaluation (baseline) assesses the project/programme at the beginning of implementation. • Interim evaluation (mid-term) takes place at one point during the life of a project/programme, usually mid-term, and assesses the likelihood of achieving the objectives. • Terminal evaluation (endline/final) assesses the progress made towards achieving the objectives at the end of a project/programme. • Ex-post evaluation (impact) assesses the impact of a project/programme after its completion.

  42. Tradeoffs among methods

  43. 5 Main Criteria for Evaluation • Efficiency: The productivity in project implementation; the degree to which Inputs have been converted into Outputs. • Effectiveness: The degree to which the Project Purpose has been achieved by the project Outputs. • Impact: Positive and negative changes produced, directly or indirectly, as a result of the implementation of the project. • Relevance: The validity of the Overall Goal and Project Purpose at the evaluation stage. • Sustainability: The durability of the benefits and development effects produced by the project after its completion.

  44. Data collection Methods

  45. Tips for Writing an Effective Report • Keep the report short, preferably under 30 pages, and always include an executive summary. • Enliven the report with true-to-life quotes, anecdotes, short case studies, questions and answers, and photographs. • Make the report more powerful by using active voice and present tense, featuring the most important information first, and highlighting key points (in boxes, bullets, bold fonts). • Use graphics: they can present lots of data in a small space, illustrate data patterns, highlight important comparisons, and have impact. • Make it appealing by using attractive layouts, desktop publishing, and high-quality materials. • Clearly specify the recommendations for action: they are the most critical component of the evaluation report. Effective recommendations don't simply happen; they must be carefully developed and presented. Try to avoid "surprises" and make recommendations realistic and easy to understand.

  46. M&E Plan Components • M&E Plan: a document describing all M&E activities in a program • Introduction • Program context (national, community-based) • Purpose of the Plan • Program description • Goals and objectives • M&E frameworks • Conceptual, logic, results • Indicators • Presented in both a matrix and indicator reference sheets • Data sources, collection & reporting systems • Plans for data use & dissemination • Information Use Mapping Tool as an option • Capacity needs for Plan implementation • Funding, TA, staff, equipment (computers, GPS) • Analysis of constraints & potential solutions • Plans for demonstrating program impact • Mechanism for Plan updates

  47. Conceptual Framework • A conceptual framework (CF) is the theory that links an effect (outcome or impact) to a group of determinants in a set of known causal relationships. Purpose: • Provides a perspective for understanding program objectives within the complete context of relevant factors in a program's operating environment • Clarifies analytical assumptions and their implications for program possibilities or limitations, as well as for measuring and analyzing the degree of success

  48. Conceptual Frameworks (a.k.a. research or theoretical frameworks) A conceptual framework (CF) links an effect (outcome or impact) to a group of determinants in a set of known causal relationships. It is a diagram that identifies and illustrates the relationships between all relevant systemic, organizational, individual, or other salient factors that may influence program/project operation and the successful achievement of program or project goals. M&E Purpose: • To show where the program fits into the wider context • To clarify assumptions about causal relationships • To show how program components will operate to influence outcomes • To guide identification of indicators • To guide impact analysis (causal pathways)

  49. Example of CF: Nutrition [Diagram] Nutritional status is determined by dietary intake and health, which are in turn shaped by access to food, care practices (care resources available), and health services and the health environment.
