
Class Meeting #5

Class Meeting #5. Program Evaluation Methods I: Needs Assessment, Formative Evaluation, Process Evaluation, and Single System Research Designs (SSRDs). Types of Program Evaluations.


Presentation Transcript


  1. Class Meeting #5 Program Evaluation Methods I: Needs Assessment, Formative Evaluation, Process Evaluation, and Single System Research Designs (SSRDs)

  2. Types of Program Evaluations “The type of evaluation you undertake to improve your programs depends on what you want to learn about the program.” McNamara, C. (1998). “Basic Guide to Program Evaluation.” Downloaded 1/5/06 from http://www.managementhelp.org/evaluatn/fnl_eval.htm.

  3. Many Possibilities • Needs assessments • Accreditation • Cost/benefit analysis • Effectiveness • Efficiency • Goal-based • Process • Outcomes

  4. Definitions • Inputs: resources needed to run the program • Process: how the program is carried out • Outputs: units of service • Outcomes: impacts on the customers or clients McNamara, C. (1998). “Basic Guide to Program Evaluation.” Downloaded 1/5/06 from http://www.managementhelp.org/evaluatn/fnl_eval.htm.

  5. Definition Needs Assessment: Decision-aiding tools used for resource allocation, program planning, and program development based on the assumption that planned programming can alleviate distress and aid growth. McKillip

  6. Four Types of Need • Normative – defined by an expert • Felt – ascertained by asking clients • Expressed – a demand for services • Comparative – inferred by comparing a group with similar groups already receiving services

  7. Approaches • Secondary data analysis • Impressionistic approach • Nominal groups • Delphi technique • Focus groups • Surveys • Convergent analysis • Multimethod approach

  8. Considerations • Budget • Resources available • Amount of time to complete the evaluation • One-time or continuing process • Amount of detail or information desired

  9. Persons of Interest • Sponsor: Agency and/or individual that authorizes the evaluation and provides the necessary fiscal resources • Stakeholders: Individuals and/or groups who have a direct interest in, and may be affected by, the program being evaluated or the evaluation results • Clients: Individuals who receive services from the program • Audience: Individuals, groups, and/or agencies who have an interest in the evaluation and receive its results

  10. Definitions • Social indicators – variables that help gauge the extent of social problems • Ecological fallacy – drawing conclusions about individuals from aggregate (group-level) data, a common risk when interpreting social indicators • Key informants – those who are informed about a given problem because of training or work experience and are willing to share their knowledge

  11. Formative Evaluation • Purpose is to adjust and enhance interventions • In-progress examination of the program or intervention • No specific methodology or procedures (flexible implementation)

  12. What it involves... “Formative Evaluation typically involves gathering information during the early stages of your project or program, with a focus on finding out whether your efforts are unfolding as planned, uncovering any obstacles, barriers or unexpected opportunities that may have emerged, and identifying mid-course adjustments and corrections which can help insure the success of your work.” NWREL http://www.nwrel.org/evaluation/formative.shtml

  13. Approaches • Compare the program to model standards or the standards of similar programs • Bring in an “expert” to observe and give suggestions • Form an “ad hoc” committee • Example: a search at http://www.google.com/ for “standards for good child care programs” can turn up model standards

  14. Possible Questions • How were program goals established? • What is the status of the program’s progress toward achieving the goals? • Will the goals be achieved according to the timelines specified in the operational plan? • Do personnel have adequate resources? • What changes are needed?

  15. Process Evaluation • Purpose is to understand how a program works and possibly reveal its strengths and weaknesses • Very useful for programs that have been operating for a long time • Possible focuses: • Program description • Program monitoring • Quality assurance

  16. A Rose by any other name... Process evaluations can go by numerous other names... http://www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf

  17. Possible Questions • How are employees trained? • How do customers or clients come into the program? • What is the general process that customers or clients go through with the product or program? • What do customers or clients consider to be strengths of the program? • What typical complaints are heard from employees and/or customers?

  18. Program Description • Clarify the purpose • Develop a data collection plan • Identify who will be interviewed • Develop instruments • Conduct interviews • Examine documents and records • Analyze and integrate information • Prepare the report

  19. Program Monitoring • Start with the goals and objectives • Compare them to data that is routinely collected • Patterns of use • Client utilization
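
A minimal sketch of what program monitoring can look like in practice, assuming the program routinely collects monthly client-visit counts and its operational plan states a monthly target. The counts, the target value, and the names used below are hypothetical illustrations, not drawn from the slides.

```python
# Minimal sketch of program monitoring: comparing routinely collected
# client-utilization counts against a stated objective. The target and
# the monthly counts below are hypothetical.

monthly_visits = {"Jan": 112, "Feb": 98, "Mar": 121, "Apr": 134}
target_per_month = 120  # e.g., taken from the program's operational plan

for month, visits in monthly_visits.items():
    status = "met" if visits >= target_per_month else "below target"
    print(f"{month}: {visits} visits ({status})")

shortfall = [m for m, v in monthly_visits.items() if v < target_per_month]
print(f"Months below target: {', '.join(shortfall) if shortfall else 'none'}")
```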

  20. Management Information Systems • “The Old-Fashioned Way” (paper and pencil) • Electronic data management • Manual entry • Electronic “dumps” (Most school systems now have MISs and test scores are electronically “dumped” into the system.)
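
As a rough illustration of an electronic “dump,” the sketch below loads a CSV export of test scores into a small SQLite table using only the Python standard library. The file name score_dump.csv, the column names, and the table layout are assumptions made for the example, not features of any particular school system’s MIS.

```python
# Minimal sketch of an electronic "dump": loading a CSV export of test
# scores into a small database table. The file name, column names, and
# table layout are hypothetical.

import csv
import sqlite3

conn = sqlite3.connect("program_mis.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS test_scores (
           student_id TEXT,
           test_date  TEXT,
           score      REAL
       )"""
)

with open("score_dump.csv", newline="") as f:
    rows = [(r["student_id"], r["test_date"], float(r["score"]))
            for r in csv.DictReader(f)]

conn.executemany("INSERT INTO test_scores VALUES (?, ?, ?)", rows)
conn.commit()
conn.close()
```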

  21. Knowing What to Count! For guidance, we go first to the organization’s: • Mission statement • Goals • Objectives

  22. Mission Statement An organization’s statement of purpose, which provides a common vision for all stakeholders and a point of reference for all major planning decisions. http://www.pgcps.org/docs/mission.pdf

  23. Goals and Objectives • Goals: General statements that specify an organization’s direction based on values, ideals, political mandates, and program purpose • Objectives: Specific and precise statements that describe what is to be accomplished and by what date; objectives have a single aim and end product

  24. Writing Objectives • Verb • Specific target • Date • Example: “Increase average daily attendance to 90% by June 1.”

  25. Measurability • Difficult to measure: help, assist, understand, know, realize, discover, improve • Measurable: increase, add, decrease, reduce, advertise, publicize, start, create

  26. Quality Assurance • Determining compliance with standards • Most human services agencies and programs have standards • “Utilization Review” • Is this program evaluation? http://www.iso.org/

  27. Sample Standards National Association for the Education of Young Children Accreditation Performance Criteria http://www.naeyc.org/accreditation/performance_criteria/

  28. Parallel in School Systems • Special Education • Diagnostic standards • Provision of services standards • Reimbursement from Medicaid

  29. Basic Steps for SSRDs • Assess student (client) behavior • Operationalize the behavior(s) • Develop measurement tools • Measure behavior(s) • Choose design • Collect data • Analyze data
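
To make the measurement and analysis steps concrete, here is a minimal sketch assuming a simple AB (baseline/intervention) single-system design in which one operationalized behavior is counted repeatedly in each phase; the data values are hypothetical. It compares phase means and computes the percentage of non-overlapping data (PND), one common SSRD summary statistic.

```python
# Minimal sketch of analyzing a simple AB single-system design.
# Assumes one target behavior measured repeatedly during a baseline
# phase (A) and an intervention phase (B); the numbers are hypothetical.

from statistics import mean

baseline = [7, 8, 6, 9, 8]         # e.g., daily call-outs before intervention
intervention = [6, 5, 4, 4, 3, 2]  # same behavior after intervention begins

baseline_mean = mean(baseline)
intervention_mean = mean(intervention)

# Percentage of non-overlapping data (PND): share of intervention points
# that fall below the lowest (best) baseline point, assuming a decrease
# in the behavior is the desired direction.
best_baseline = min(baseline)
pnd = 100 * sum(1 for x in intervention if x < best_baseline) / len(intervention)

print(f"Baseline mean:     {baseline_mean:.1f}")
print(f"Intervention mean: {intervention_mean:.1f}")
print(f"Change in level:   {intervention_mean - baseline_mean:+.1f}")
print(f"PND:               {pnd:.0f}%")
```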

  30. Operationalize = Choose Variables • There are variables that predict things we cannot control • There are variables that predict things we can control • There are variables that do not predict Interventions should focus on variables that predict things we can control!
