
SOWK 6003 Social Work Research Week 7 Designs for Evaluating Programmes and Practice – Experimental designs


Presentation Transcript


  1. SOWK 6003 Social Work Research, Week 7: Designs for Evaluating Programmes and Practice – Experimental Designs. By Dr. Paul Wong

  2. Purposes of Research • Exploration • Description • Explanation • Evaluation • Multiple purposes

  3. Program evaluation refers to the purpose of research rather than to any specific research methods.

  4. Purposes of program evaluation 1) to assess the ultimate success of programs; 2) to assess problems in how programs are being implemented; or 3) to obtain information needed in program planning and development

  5. 1990s in the US: program evaluation as a prerequisite for approving grant applications

  6. Two major types of evaluations Summative – concerned with assessing the success of programs Formative – concerned with obtaining information that is helpful in planning the program and in improving its implementation and performance

  7. Politics The evaluation itself can be dealt with quite straightforwardly; however, the POLITICS of it can be very difficult to handle.

  8. Some common obstacles • In-house vs. external evaluators • Utilization of the findings: • The implications may not always be presented in a way that nonresearchers can understand; • Evaluation results sometimes contradict deeply held beliefs • Logistical and administrative problems. Ways to address these obstacles: • Learning as much as possible about the stakeholders; • Maintaining ongoing mutual feedback between them and the evaluator; • Tailoring the evaluation and its reportage to their needs and preferences as much as possible without sacrificing scientific objectivity

  9. Types of Program Evaluation 1) Evaluating outcome and efficiency: • Outcome evaluation should strive to enhance causal inference by using the most internally valid experimental or quasi-experimental design possible • Efficiency assessment asks whether program outcomes are being achieved at a reasonable cost, applying the principles of cost accounting to calculate the ratio of program benefits to costs
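The benefit-to-cost ratio described above is simple division: total program benefits over total program costs, with a ratio above 1.0 indicating that estimated benefits exceed costs. A minimal Python sketch (the dollar figures are hypothetical, for illustration only):

```python
def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Ratio of program benefits to costs; > 1.0 means benefits exceed costs."""
    if total_costs <= 0:
        raise ValueError("total_costs must be positive")
    return total_benefits / total_costs

# Hypothetical example: a program costing $200,000 whose estimated
# benefits (e.g., reduced use of other services) total $260,000.
print(benefit_cost_ratio(260_000, 200_000))  # 1.3
```

In practice, the hard part is not the arithmetic but deciding which benefits to monetize and how, which is why efficiency assessment draws on cost-accounting principles.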

  10. 2) Process evaluation – aims to investigate the “means”, not the “ends”; focuses on identifying strengths and weaknesses in program processes and recommending needed improvements, e.g. (p. 323): • What types of individuals are not being reached by the service? • Are clients satisfied with services? Why and why not? • Tends to rely heavily on qualitative methods

  11. 3) Needs assessment • A diagnostic evaluation • Common techniques: • Key informants • Community forum • Rates-under-treatment approach • Social indicators • Community survey • Focus groups

  12. Randomized Controlled Trial – What is it?
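The defining feature of a randomized controlled trial is that participants are assigned to the treatment or control group by chance alone, so the groups are comparable before the intervention. A minimal Python sketch of simple random assignment (the participant IDs and even split are hypothetical illustration, not part of the lecture):

```python
import random

def randomize(participants: list, seed=None) -> dict:
    """Shuffle the participant list and split it in half: the first half
    becomes the treatment group, the second half the control group."""
    rng = random.Random(seed)  # seed only to make the example reproducible
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

groups = randomize([f"P{i}" for i in range(1, 11)], seed=42)
print(groups["treatment"])
print(groups["control"])
```

Real trials use more elaborate schemes (blocking, stratification) to balance group sizes and known covariates, but the principle is the same: assignment is determined by a random process, not by the evaluator or the clients.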

  13. In-class activities Break into small groups and design program evaluation studies regarding this class: how would you assess the need for it, its implementation, and its outcome in attaining its goals? Article critique: Cheung Chau Suicide Prevention Programme – a local evaluation study using an RCT methodology
