
M&E Framework and Tools and Development Evaluation





Presentation Transcript


  1. Professional Training M&E Framework and Tools and Development Evaluation

  2. 1. Introduction to the course To strengthen the M&E skills and expertise of researchers and improve their impact on the general public and on development. • Building the capacity and skills of researchers in M&E systems and development evaluation • Strengthening the capacity of researchers to develop M&E frameworks and tools • Strengthening the capacity of researchers to conduct program and project evaluation • Equipping researchers with M&E skills and expertise • Become familiar with the concepts and practices of M&E • Be able to develop M&E frameworks and tools • Be able to conduct program/project evaluation • Be equipped with M&E skills and expertise • M&E Specialist • Professional Research Consultant

  3. 1. Introduction to the course • M&E Framework and Tools Development • Module 1: M&E Rapid Assessment • Module 2: M&E Framework Development • Module 3: Monitoring Tools Development • Module 4: M&E Tools Pilot and Review • Module 5: Finalize M&E Framework and Tools • Module 6: Roll-out Plan and M&E Costed Capacity Plan • Development Evaluation • Module 1: Objectives of Evaluation • Module 2: Focus and Scope • Module 3: Select Indicators • Module 4: Choose Study Design • Module 5: Data Collection Plan • Module 6: Train Data Enumerators • Module 7: Data Collection/Field Work • Module 8: Data Processing and Analysis • Module 9: Data Organization and Interpretation • Module 10: Evaluation Report Writing

  4. Monitoring and Evaluation? [Diagram: the results chain, with the type of monitoring or evaluation that measures each level. OBJECTIVES drive: INPUTS (Input Monitoring; Efficiency) → PROCESS (Process Monitoring) → OUTPUTS (Output Monitoring) → OUTCOMES (Outcome Monitoring and/or Evaluation; Effectiveness) → IMPACTS (Impact Monitoring and/or Evaluation).]

  5. II. M&E Framework and Tools Development

  6. M&E Frameworks • Describe the interaction of various factors • Logically link program objectives • Logically link inputs, processes, outputs, and outcomes

  7. Conceptual Framework Community and health actors and the Commune Committee for Women and Children develop and manage systems, which they use to deliver activities/services for communities. These produce outputs that lead to health outcomes and other outcomes, which in turn contribute to impacts on health and to a reduction in the vulnerability of OVC: community action and results for health and non-health.

  8. Result Framework

  9. Logical Framework

  10. M&E Frameworks

  11. M&E Framework Goal: Strengthen the coordination, systems, coverage and quality of services needed to mitigate the impact of HIV on the lives and futures of Cambodian children, while also addressing the underlying issues facing vulnerable children. Impact Indicators: % of birth registration; proportion of current school attendance; stunting, underweight and wasting.

  12. M&E Tools Development • Select Indicator Standards • Reporting Format • Instruction Guide • Data Flow and Management • Train M&E Data Collectors • Piloting and Updating • Roll-out Plan • Database System • Data Use Plan

  13. Indicator Standards A good indicator should meet the following six standards: • The indicator is needed and useful • The indicator has technical merit • The indicator is fully defined • It is feasible to measure the indicator • The indicator has been field-tested or used operationally • The indicator set is coherent and balanced (relevant to indicator sets only)

  14. Indicator Standards • STANDARD 1: THE INDICATOR IS NEEDED AND USEFUL • Question 1: Is there evidence that this indicator is needed at the appropriate level? • Question 2: Which stakeholders need and would use the information collected by this indicator? • Question 3: How would information from this indicator be used? • Question 4: What effect would this information have on planning and decision-making? • Question 5: Is this information available from other indicators and/or other sources? • Question 6: Is this indicator harmonized with other indicators?

  15. Indicator Standards STANDARD 2: THE INDICATOR HAS TECHNICAL MERIT • Question 1: Does the indicator have substantive merit? Is it technically sound, and does it measure something significant and important within its particular field? • Question 2: Is the indicator reliable and valid? • Question 3: Has the indicator been peer reviewed?

  16. Indicator Standards STANDARD 3: THE INDICATOR IS FULLY DEFINED • Title and definition • Purpose and rationale • Method of measurement • Data collection methodology • Data collection frequency • Data disaggregation • Guidelines to interpret and use data • Strengths and weaknesses • Challenges • Relevant sources of additional information

  17. Indicator Standards STANDARD 4: IT IS FEASIBLE TO COLLECT AND ANALYSE DATA FOR THIS INDICATOR • Question 1: How well are the systems, tools and mechanisms required to collect, interpret and use data for this indicator functioning? • Question 2: How would this indicator be integrated into a national M&E framework and system? • Question 3: To what extent are the financial and human resources needed to measure this indicator available? • Question 4: What evidence exists that measuring this indicator is worth the cost?

  18. Indicator Standards STANDARD 5: THE INDICATOR HAS BEEN FIELD-TESTED OR USED OPERATIONALLY • Question 1: To what extent has the indicator been field-tested or used operationally? • Question 2: Is this indicator part of a system to review its performance in ongoing use?

  19. Indicator Standards STANDARD 6: THE INDICATOR SET IS COHERENT AND BALANCED (relevant to indicator sets only) • Question 1: Does the indicator set give an overall picture of the adequacy or otherwise of the response being measured? • Question 2: Does the indicator set have an appropriate balance of indicators across elements of the response? • Question 3: Does the indicator set cover different M&E levels appropriately? • Question 4: Does the set contain an appropriate number of indicators?

  20. Indicator Protocols An indicator protocol covers: • Definition • Measurement • Strengths • Limitations • Reliability • Precision • Validity A good indicator is also objective, owned, accessible, and useful. Key terms: • Indicator: defines clearly what should be measured; it defines the variables that help measure change within a given situation as well as describe progress and impact. • Reliability: the consistency or dependability of data and evaluation judgments, with reference to the quality of the instruments, procedures and analysis used to collect and interpret evaluation data. • Validity: the extent to which something is reliable and actually measures what it claims to measure. • Cross-checking (triangulation): the process of cross-checking to ensure that the data obtained from one monitoring method are confirmed by the data obtained from a different method.

  21. M&E FRAMEWORK & TOOLS DEVELOPMENT [Diagram: the development cycle. M&E Rapid Assessment → M&E Framework Development → Monitoring Tools Development → M&E Tools Pilot and Review → Finalize M&E Framework and Tools → Roll-out Plan and M&E Costed Capacity Plan.]

  22. Instruction Guide • What is an instruction guide? • An instruction guide is a reference tool that provides clear explanations of how to accurately complete the reporting format. • How to develop an instruction guide? • Identify the purpose of the instruction guide • State the purpose of the reporting form • Data sources • Who prepares the report • Frequency of reporting • Reporting period • Name of agency completing the report • District • Province • Indicators

  23. Instruction Guide Indicators: • For example: Total number of OVC whose households received economic support (income generation activities, livelihood support, regular cash transfer) • Write the total number of OVC whose households received economic support during the reporting period. Definition: Economic support (IGAs and livelihood) has been defined as: • Home gardening • Animal husbandry • Provision of agricultural seeds • Small business development • Money management training • Emergency cash support • Regular cash transfers • Access to loan/microfinance • Other Disaggregation: • This data is disaggregated by gender. Write the total number of male OVC in the “Male” column and the total number of female OVC in the “Female” column. Then write the total number of OVC (male + female) in the “Total” column.
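The disaggregation rule above (Male + Female = Total) can be sketched in code. This is a hypothetical illustration only: the record structure and field names are assumptions, not part of the reporting format.

```python
# Hypothetical OVC records; in practice these would come from the
# registration book or reporting forms.
records = [
    {"id": 1, "gender": "Male", "economic_support": True},
    {"id": 2, "gender": "Female", "economic_support": True},
    {"id": 3, "gender": "Female", "economic_support": False},
    {"id": 4, "gender": "Male", "economic_support": True},
]

# Keep only OVC whose households received economic support this period.
supported = [r for r in records if r["economic_support"]]

# Fill the "Male", "Female" and "Total" columns of the reporting form.
male = sum(1 for r in supported if r["gender"] == "Male")
female = sum(1 for r in supported if r["gender"] == "Female")
total = male + female  # "Total" column is always Male + Female

print(male, female, total)  # 2 1 3
```

The same tally pattern extends to any other disaggregation the form requires (age group, district, etc.).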

  24. Data Flow • When mapping the flow of data, please consider the following issues: • Who will be responsible for data collection? • Who will provide the data? • Who will be responsible for supervision of data collection? • Who will be responsible for compiling and aggregating data? • How often are data collected, compiled, reported, and analyzed? • How are data sent from one level to the next? • How is feedback on reported data provided?

  25. Data Flow [Diagram: national OVC data flow, with reporting (quarterly and annual), feedback, and supportive supervision flows among: Ministry of Planning; Ministry of Social Affairs, Veterans and Youth Rehabilitation (MoSVY) (Child Welfare Department); MoH; NOVCTF; Provincial Department of Planning; Provincial Department of Social Affairs, Veterans and Youth Rehabilitation (PoSVY); PHD; POVCTF; DoSVY; law enforcement (police, prisons, courts); CCWC; Commune Councils (via CDB); service providers (NGOs); alternative care centers; youth rehabilitation / drug rehabilitation centers; and Village Councils (via CDB).]

  26. Identify R&R of Key Players When developing the roles and responsibilities of all key players involved in data collection, some important points to consider: • What types of indicators do they need to collect and report? • How many indicators do they need to collect and report? • How do they collect those data (source of data, e.g. registration book)? • Which reporting form do they use? • How frequently should they report, and when? • To whom should they report?

  27. Sources of data error • Transposition—An example is when 39 is entered as 93. Transposition errors are usually caused by typing mistakes. • Copying errors—One example is when 1 is entered as 7; another is when the number 0 is entered as the letter O. • Coding errors—Putting in the wrong code. For example, an interview subject circled 1 = Yes, but the coder copied 2 (which = No) during coding. • Routing errors—Routing errors result when a person filling out a form places a number in the wrong part or wrong order. • Consistency errors—Consistency errors occur when two or more responses on the same questionnaire are contradictory, for example a recorded birth date that is inconsistent with the recorded age. • Range errors—Range errors occur when a number lies outside the range of probable or possible values.
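Some of these error types can be caught automatically during data entry or cleaning. The sketch below checks for range, consistency, and coding errors; the field names, valid ranges, and codebook are illustrative assumptions, not from any actual reporting form.

```python
from datetime import date

def check_record(rec, reference_date):
    """Return a list of data-quality problems found in one record."""
    errors = []
    # Range error: age outside plausible bounds
    if not 0 <= rec["age"] <= 120:
        errors.append("range: age out of bounds")
    # Consistency error: reported age contradicts the birth date
    implied_age = reference_date.year - rec["birth_date"].year
    if abs(implied_age - rec["age"]) > 1:  # allow off-by-one around birthdays
        errors.append("consistency: age vs. birth date")
    # Coding error: answer code outside the codebook (1 = Yes, 2 = No)
    if rec["received_support"] not in (1, 2):
        errors.append("coding: invalid code")
    return errors

# A record with a deliberate age/birth-date contradiction and a bad code
rec = {"age": 35, "birth_date": date(1990, 6, 1), "received_support": 3}
print(check_record(rec, date(2013, 1, 1)))
```

Transposition and copying errors cannot be caught this way; double entry or spot checks against source documents are the usual remedies for those.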

  28. What to do when mistakes are found • First, determine the source of the error: • whether it arises from a data coding or entry error, or • whether the entry is unclear, missing, or otherwise suspicious. • Once the source of the error is identified, the data should be corrected if appropriate.

  29. Points to consider when providing feedback • Feedback should be constructive and not punitive • Feedback should be useful to data collectors and help them improve their work • Errors should be pointed out and corrected • The M&E supervisor should talk to the data collector to find out the cause of the error so it can be prevented in the future • The M&E supervisor should discuss how data quality and reports can be improved in the future

  30. Points to note when providing supportive feedback • Provide both positive and negative feedback (e.g. you do X very well but can improve Y) • Provide feedback in a timely manner • Help data collectors understand the problem so they know how to correct it in the future • Be helpful and collaborative

  31. Why is it important to provide supportive feedback • Builds relationships between data collectors and users at all levels • Important element of management and supervision • Leads to greater appreciation of data • Improves data quality • Improves information use • Improves service delivery and benefits the target population and the community • Improves program reporting: data collectors understand trends in the data and the reasons behind the numbers • Incentivizes and motivates data collectors

  32. Pilot M&E Tools • Set criteria for selecting pilot province • Provide training on M&E reporting tools to all data collectors • Provide on the job training to all data collectors

  33. Pilot M&E tools review • Objective: • To take an in-depth look at the quality of the data collected during the pilot period, assess the systemic factors that affect M&E performance, and gather direct input on the M&E tools and system.

  34. Pilot M&E tools review • Steps in conducting the review: • Develop assessment tools covering: • Data transmission • Data accuracy • Data processing and analysis • Data use • Some added qualitative questions • Provide training to the assessment team • Conduct the assessment • Conduct a consultation meeting on the findings

  35. Finalize Framework and Tools • Key points affecting the finalization of the M&E framework and mechanics • Indicators • Are these indicators feasible to collect? • Are these indicators feasible to analyze and use? • Is there any evidence that financial and human resources are available to allow an indicator to be measured, and that the benefits of measuring the indicator are worth the costs? A good indicator needs to be one that is feasible to measure with reasonable levels of resources and capacity.

  36. Finalize Framework and Tools The situation may change, meaning that an indicator needs to be changed, discarded or added. • M&E system mechanics • Are the data collection tools applicable? • Are the reporting formats applicable? • Is the instruction guide (guideline) user friendly? • Data management process • How well is the data flow of the system functioning? • Do existing human resources have the appropriate capacity to manage the data flow? • How clear are the roles and responsibilities of the departments or persons involved in the M&E system? • Is the frequency of data collection and reporting appropriate at each level?

  37. Finalize Framework and Tools • Revise the M&E framework, with revised indicators, M&E mechanics, and data management process • Conduct a consultative meeting among the M&E team and relevant stakeholders to finalize the M&E framework and system • Get approval from the top level of management (decision makers, policy makers).

  38. Purpose of M&E/Data use Program Improvement Share Data with Partners Reporting/ Accountability

  39. Development Evaluation

  40. Reasons for Evaluation/Research • Positive impacts of development • Beautification • Development • Employment • GDP growth • Economic growth • Moving people out of slums • Negative impacts of development • Human rights violations • Inadequate housing rights • Unfair compensation and worsened living conditions • Unfair development • Inequality in the distribution of profits • Loss of jobs • High service costs at relocation sites • No available legal, social and health services

  41. Reasons for Evaluation/Research

  42. Reasons for Evaluation/Research

  43. Steps for Evaluation and Research 1 Objectives 2 Focus & Scope 3 Select Indicators 4 Choose Study Design 5 Data Collection Plan 6 Train Data Enumerators 7 Data Collection/Field Work 8 Data Cleaning & Verification 9 Data Processing & Aggregation 10 Data Analysis & Organization 11 Data Interpretation & Report 12 Data Use and Data Translation

  44. Objectives • The overall objective of the program evaluation for HRTF is to assess the socio-economic impact of forced evictions in the urban areas of Phnom Penh Municipality. • The specific objectives of the program evaluation are to determine the status of the economy, education, health, employment, food security and environment of threatened and relocated communities.

  45. Scope and Focus Poverty and quality of life among relocated households and threatened households

  46. Selected Indicators

  47. Study Design • Qualitative and quantitative study design (cross-sectional study) • Household Survey (cluster sampling, lot division) • Key Informant Interviews (KII) with relevant stakeholders • Focus Group Discussions (FGD) with relocated-site and threatened-site households • Desk Study and Literature Review • Cambodia legal frameworks • National and international research findings • NSDP and JMI 2009-2013, MoP • Pro-Poor Policy and National Safety Net Strategy, CoM • HRTF Baseline Survey 2010 • HRTF program and strategy documents • HRTF Strategic Plan 2011-2015 • CCHR Survey on Land and Housing Issues 2011 • Draft of National Housing Policy 2011 • Country Reports of the Special Rapporteurs, 2009, 2010, 2011 • Others
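The household survey above uses two-stage cluster sampling: first sample clusters (lots), then households within each selected cluster. A minimal sketch, with entirely hypothetical cluster sizes, names, and sample counts:

```python
import random

random.seed(42)  # fixed seed so the draw is reproducible and auditable

# Stage 1 frame: 20 clusters (lots), each listing its 50 households.
clusters = {f"lot_{i}": [f"hh_{i}_{j}" for j in range(50)] for i in range(20)}

# Stage 1: simple random sample of 5 clusters.
selected_clusters = random.sample(sorted(clusters), k=5)

# Stage 2: simple random sample of 10 households within each selected cluster.
sample = []
for c in selected_clusters:
    sample.extend(random.sample(clusters[c], k=10))

print(len(sample))  # 5 clusters x 10 households = 50
```

Sampling whole clusters first keeps fieldwork travel manageable, at the cost of a design effect that the sample-size calculation should account for.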

  48. Sample size Calculation

  49. Confidence and Precision • Confidence Level: The standard confidence level is 95%. This means you want to be 95% certain that your sample results are an accurate estimate of the population as a whole. • Precision: This is sometimes called sampling error or margin of error. We often see this when results from polls are reported. • Confidence Interval: We can say that we are 95% certain (this is the confidence level) that the true population average salary is between 1,800 and 2,200 (this is the confidence interval).
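The salary example above can be worked through numerically. This sketch computes a 95% confidence interval for a mean using the normal approximation; the salary values are made up for illustration.

```python
import math
import statistics

# Hypothetical sample of monthly salaries
salaries = [1900, 2100, 1850, 2050, 2200, 1800, 2000, 2100]

n = len(salaries)
mean = statistics.mean(salaries)
se = statistics.stdev(salaries) / math.sqrt(n)  # standard error of the mean

z = 1.96          # z-score corresponding to a 95% confidence level
margin = z * se   # precision (margin of error)
ci = (mean - margin, mean + margin)

print(round(mean), round(ci[0]), round(ci[1]))
```

We would report this as: we are 95% confident the true population average salary lies between the two printed bounds.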

  50. Sample Size Calculation n = N / (1 + N(e)²)
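This is Taro Yamane's simplified sample-size formula, where n is the required sample size, N the population size, and e the precision (margin of error) from the previous slide. A minimal implementation:

```python
import math

def yamane_sample_size(population: int, margin_of_error: float = 0.05) -> int:
    """Yamane's formula: n = N / (1 + N * e^2), rounded up."""
    n = population / (1 + population * margin_of_error ** 2)
    return math.ceil(n)

# e.g. a population of 10,000 households at +/-5% precision
print(yamane_sample_size(10_000))  # 385
```

Note the formula assumes a 95% confidence level and maximum variability; larger populations converge toward a sample of about 1/e², i.e. roughly 400 at 5% precision.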
