
Program Evaluation





Presentation Transcript


  1. Program Evaluation • A systematic effort to describe the status of a program • Extent to which program objectives are achieved

  2. Uses of Health Program Evaluation • Insight: Needs - Barriers - Activities • Improvement: Social mobilization - Inter-sectoral coordination - Implementation - Client conveniences (Affordability, Accessibility, Availability) - Cost-benefit • Outcomes / Impact: Skills development - Behavioral change - Level of success in achieving objectives - Accountability

  3. Types of Evaluation Needs Assessment To identify • Goals • Products • Problems • Conditions

  4. Types of Evaluation Contd… Formative (Process) Evaluation To improve a developing or ongoing program • Role as helper/advisor/planner • Progress in achievements • Potential problems/needs for improvement in: • Program management • Inter-sectoral coordination • Social mobilization • Implementation • Outcomes

  5. Types of Evaluation Contd… Summative (Coverage) Evaluation (To help decide a program's ultimate fate) • Summary statement about the program's achievements • Unanticipated outcomes • Comparison with other programs

  6. Sample Size Factors - Purpose of study - Population size - Level of precision (sampling error) - Confidence / risk level - Degree of variability - Appropriateness for the planned analysis - Appropriateness for comparative analysis of subgroups - Allowance for non-respondents

  7. Sample Size Contd… Strategies - Using a census (for small populations) - Using the sample size of a similar study - Using published tables / software - Using formulas
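The "using formulas" strategy above is commonly Cochran's formula with a finite population correction; the slides do not say which formula was used, so this is a sketch under that assumption (the function name and default values are illustrative, and the non-response allowance implements the "allowance for non-respondents" factor from slide 6):

```python
import math

def sample_size(population, precision=0.05, confidence_z=1.96, p=0.5,
                nonresponse_rate=0.10):
    """Estimate survey sample size (Cochran's formula, assumed here).

    precision        -- acceptable sampling error e (e.g. +/-5%)
    confidence_z     -- z-score for the confidence level (1.96 ~ 95%)
    p                -- expected proportion; 0.5 gives maximum variability
    nonresponse_rate -- inflate the sample to cover non-respondents
    """
    # Cochran's sample size for an effectively infinite population
    n0 = (confidence_z ** 2) * p * (1 - p) / precision ** 2
    # Finite population correction for smaller populations
    n = n0 / (1 + (n0 - 1) / population)
    # Add allowance for non-respondents
    return math.ceil(n / (1 - nonresponse_rate))

print(sample_size(10000))   # population 10,000 at +/-5% precision, 95% confidence
```

A published sample-size table (the third strategy above) is essentially this formula pre-computed over a grid of population sizes and precision levels.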

  8. Health Program Evaluation - Quantitative Research Methods Approach Measures the reaction of a great many people to a limited set of questions Comparison and statistical aggregation of the data Broad, generalizable set of findings presented succinctly and parsimoniously.

  9. Health Program Evaluation Qualitative Research Methods • Participant Observation • Key Informant Interviews • Open ended Interview • Focus Group Discussions • Pile sort

  10. Health Program Evaluation - Qualitative Research Methods Contd… Findings - Presented alone or in combination with quantitative data - Validity and reliability depend on the methodological skills, sensitivity, and integrity of the researchers - Skillful interviewing is more than just asking questions - Content analysis is more than just reading to see what's there - Useful and credible findings are generated through observation, interviewing, and content analysis How? - Discipline, knowledge, training, practice, creativity, hard work

  11. Data Processing • Raw field notes should be corrected, edited, and typed • Tape recordings need to be transcribed and corrected • Texts by field workers should not be changed to make them 'writable' or 'readable'

  12. Data Reduction • The process of selecting, focusing, simplifying, abstracting, and transforming data from field notes and transcripts • The researcher retains some data chunks, pulls out others, and gets an idea of the story to tell

  13. Analysis Steps • Free listing • Domain Evolution • Coding • Tabulation • Summarizing
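The coding and tabulation steps above can be sketched for open-ended responses. Everything here is illustrative: the codebook, the keyword-matching rule, and the sample responses are hypothetical, not from the IndiaCLEN studies:

```python
from collections import Counter

# Hypothetical codebook mapping response keywords to analysis codes (Coding step)
CODEBOOK = {
    "distance": "ACCESS",
    "far": "ACCESS",
    "cost": "AFFORDABILITY",
    "expensive": "AFFORDABILITY",
    "rumour": "BELIEFS",
    "side effect": "BELIEFS",
}

def code_response(text):
    """Assign every matching code from the codebook to one free-text response."""
    text = text.lower()
    return {code for keyword, code in CODEBOOK.items() if keyword in text}

def tabulate(responses):
    """Tabulation/Summarizing steps: count how often each code appears."""
    counts = Counter()
    for response in responses:
        counts.update(code_response(response))   # each code counted once per response
    return counts

responses = [
    "The centre is too far from our village",
    "Drops are free but travel is expensive",
    "We heard rumours about side effects",
]
print(tabulate(responses))
```

In practice the free-listing and domain steps come first (deciding which codes exist at all), and coding is usually done by trained researchers rather than keyword matching; the sketch only shows how coded responses roll up into a tabulation.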

  14. Quotable Quotes • Give a vivid, meaningful flavor that is far more convincing than pages of summarized numbers • Should not be distracters • Should not take the reader away from the real issues at hand

  15. “This is an unprecedented event where all people irrespective of caste, creed and religion take part in PPI program on the same day (NID) throughout the country” • Health worker (150): Burdwan • “He (my husband) told me that everybody is going for polio drops. Then why should we be left out ? After all, everybody is not a fool” • Utilizer (1422): Delhi • “We have not at all immunized our son. My husband was very stubborn. He said ‘those who are immunized are also getting this disease (polio) and whatever happens let it happen’. He has not allowed me to get the child immunized” • Non utilizer (630): Hyderabad

  16. Data Display • This is an organized, compressed assembly of information that permits conclusion drawing and action • Matrices, graphs, charts and networks are used

  17. Qualitative vs. Quantitative Research - the dichotomy

                    Qualitative                    Quantitative
  Social theory     Action                         Structure
  Methods           Observation, interview         Experiment, survey
  Question          What is X? (classification)    How many Xs? (enumeration)
  Reasoning         Inductive                      Deductive
  Sampling method   Theoretical                    Statistical
  Strength          Validity                       Reliability

  18. Qualitative vs. Quantitative • It is not qualitative vs. quantitative but qualitative and quantitative • Qualitative methods are rapid, exploratory, and hypothesis-generating • Can be used in impact evaluation research • Allow the researcher to palpate unique cross-cultural features

  19. Multi-Centric Evaluation Studies Issues • Common understanding of the program • Common understanding of aims and objectives of the evaluation exercise • Standardization of research instruments • Standardization of protocol implementation at various sites • Regional variation in program implementation

  20. Multi-Centric Evaluation Studies Contd… Steps • Cohesive network of partners • Multi-disciplinary team of investigators • Piloting of instruments at different sites • Development of common understanding • Training of research teams • Multiple layers of quality assurance measures

  21. Network Dynamics Sustaining the Network During active period: • Recognized individual efforts to excel - Co-opted as extended CCT members • Recognized PMC efforts to excel - Made regional centers • Communication Channels (phone, fax, e-mail) • Pilot of instruments (ownership) During interface: • IndiaCLEN / INCLEN meetings, workshops • Feedback on completed reports

  22. Network Dynamics Contd… Quality Assurance Measures • National orientation workshop (PI s) • Regional orientation workshops (PI s, RA s) • PMCs, Regional Centers: Quality checks of interviews, schedules, tapes • Central Coordinating Office: Random checks of recordings, interviewing procedure, transcripts, translations • Regional Coordinators, CCT members: Site visits / Supervision of FGDs • Method triangulation using In-depth interviews, FGDs

  23. Quality Assurance Measures • Development of Interview Schedules • Consistency Checks / VALIDITY • Method Triangulation • Data Triangulation • Data Interpretation • Partner Medical Colleges • Regional CEUs • Central Coordinating Office

  24. Capacity Building National Level Leadership transfer to coordinate MI project - Thiruvananthapuram State Level PMCs upgraded to Regional Centers Upgrading of physical facilities Ten investigators INDEPENDENTLY took up evaluation of national programs at state/district level Network Partners Research - individual/collaborative Resource persons - local/regional/national Extended CCT members

  25. Interaction with Program Managers / Policy Makers Program Evaluation: A partnership exercise • Developing objectives, instruments • Dissemination of Findings • Support other program related activities

  26. Dynamics of Establishing Partnerships with Policy Makers Evaluators express their opinions explicitly - based on evidence gathered - consistent quality assurance measures - limitations of study accepted up front - politics of evaluation - remains a scientific endeavor

  27. Dynamics of Establishing Partnerships with Policy Makers • Results to be presented in a manner which are perceived as VALID, RELIABLE & FEASIBLE TO IMPLEMENT • Working in a strict TIME SCHEDULE for timely fine-tuning of strategies • Program Evaluators have to establish CREDIBILITY with Program Managers

  28. How can Evaluation Data be Used? • Program managers • Redefining aims objectives • Modifying or fine tuning strategies (process) • Sustainability (including fatigue factor) • Judge the worth (impact) • Expense / cost • Interaction with other activities

  29. How can Evaluation Data be Used? • Education • Generalisability • Unique features (success/failure) • Determinants of provider and client behavior

  30. IndiaCLEN Program Evaluation Network Activities

  31. Studies Completed (1997-2001)

  32. Forthcoming Studies

  33. IndiaCLEN Members - 1996 (map of centers): Delhi, Lucknow, Nagpur, Chennai, Vellore, Thiruvananthapuram

  34. IndiaCLEN Program Evaluation Network - 1997 (map of centers): Srinagar, Delhi, Ballabhgarh, Agra, Jodhpur, Dibrugarh, Lucknow, Patna, Ahmedabad, Bhopal, Burdwan, Bilaspur, Nagpur, Berhampur, Mumbai, Gulbarga, Panaji, Tirupati, Bangalore, Chennai, Vellore, Madurai, Calicut, Thiruvananthapuram

  35. IndiaCLEN Program Evaluation Network - 2000 (map of centers): Srinagar, Kangra, Delhi, Rohtak, Ballabhgarh, Agra, Jodhpur, Dibrugarh, Lucknow, Bikaner, Jaipur, Darbhanga, Kohima, Kanpur, Guwahati, Kota, Gwalior, Agartala, Patna, Imphal, Jamnagar, Bhopal, Burdwan, Aizwal, Ahmedabad, Kolkata, Bilaspur, Sambhalpur, Cuttack, Nagpur, Berhampur, Hyderabad, Mumbai, Gulbarga, Panaji, Vijayawada, Tirupati, Bangalore, Chennai, Vellore, Kannur, Madurai, Calicut, Thiruvananthapuram

  36. Agenda Item No. 14 - Conduction of Family Health Awareness Campaign A brief (15-minute) presentation on the evaluation of the FHAC round 2000 was made by Dr. N.K. Arora, IndiaCLEN, AIIMS. "Addl. Secretary & Project Director (NACO) said that the shortcomings observed in the evaluation of the campaign should be taken into consideration while preparing action plans for the next round of FHAC in the year 2001. After discussion (one hour 15 minutes) with the State Project Directors, it was decided that…" Letter No. T.11014/2/2001-NACO dated 05.07.2001

  37. IndiaCLEN Program Evaluation Network VISION Facilitate development and implementation of people-friendly, effective

  38. IndiaCLEN Program Evaluation Network Investigators

  39. Program Evaluations - Relevance to Policy • ACADEMIA can play an important role in influencing national policy - multi-disciplinary teams • Evaluations are not done in a VACUUM; they should be policy relevant - central, state, district level • RECOGNIZE policy makers and other stakeholders as partners

  40. Models of Program Evaluation • Goal oriented evaluation - Aimed to assess the progress and the effectiveness of innovations/interventions • Decision oriented evaluation - Aimed to facilitate intelligent judgements by decision makers • Responsive evaluation - Aimed to depict program process and the value perspectives of key people • Evaluation research - Focused on explaining effects, identifying causes of effects, and generating generalizations about program effectiveness • Goal free evaluation - To assess program effects based on criteria apart from the program's own conceptual framework, especially the extent to which real client needs are met • Advocacy-adversary evaluation - Evaluation should derive from the argumentation of contrasting points of view • Utilization-oriented evaluation - Structured to maximize the utilization of its findings by specific stakeholders and users

  41. Design Effect Ratio of variance with cluster sampling to variance with simple random sampling
  Var(simple random sampling) = p(1 - p) / n
  Var(cluster sampling) = Σ(p_i - p)² / [k(k - 1)]
  Design effect = Var(cluster) / Var(SRS) = n Σ(p_i - p)² / [k(k - 1) p(1 - p)]
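The design-effect ratio on slide 41 can be computed directly from observed per-cluster proportions. A minimal sketch (the function and variable names are mine; the slide gives only the formulas):

```python
def design_effect(cluster_proportions, cluster_size):
    """Design effect = Var(cluster sampling) / Var(simple random sampling).

    cluster_proportions -- observed proportion p_i in each of k clusters
    cluster_size        -- number of subjects sampled per cluster
    """
    k = len(cluster_proportions)
    n = k * cluster_size                       # total sample size
    p = sum(cluster_proportions) / k           # overall proportion
    # Variance under cluster sampling: sum of squared deviations / k(k-1)
    var_cluster = sum((pi - p) ** 2 for pi in cluster_proportions) / (k * (k - 1))
    # Variance under simple random sampling: p(1-p)/n
    var_srs = p * (1 - p) / n
    return var_cluster / var_srs

# Illustrative example: 5 clusters of 30 subjects with varying coverage;
# the between-cluster variation here yields a design effect above 1
deff = design_effect([0.5, 0.9, 0.6, 0.8, 0.7], cluster_size=30)
```

A design effect above 1 means cluster sampling was less precise than a simple random sample of the same size, so the planned sample size should be inflated by that factor.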


  43. Summary Qualitative methods aim to make sense of, or interpret, phenomena in terms of the meanings people bring to them Qualitative research may define preliminary questions which can then be addressed in quantitative studies A good qualitative study will address a clinical problem through a clearly formulated question and using more than one research method (triangulation) Analysis of qualitative data can and should be done using explicit, systematic, and reproducible methods

  44. Development of Program Objectives (Program Evaluators) • Lessons of success & failure • Wider application of program strategies • Determinants of client behavior • Impact on other health systems • [national & international interest in later part]
