
Leadership for Innovative Omani Schools in the 21st Century


Presentation Transcript


  1. Data-Driven Planning and Decision-Making for Continuous School Improvement: Developing a Program Evaluation Plan. Leadership for Innovative Omani Schools in the 21st Century

  2. Program Evaluation “The systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.” — Michael Q. Patton [1]

  3. Program Evaluation “The systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy.” — Carol Weiss [2]

  4. Purposes of Evaluation • Formative: provide feedback to make improvements • Summative: provide feedback to make judgments about a program’s worth or success

  5. Program Evaluation "When the cook tastes the soup, that’s formative; when the guests taste the soup, that’s summative." — Robert Stake [3]

  6. What Program Evaluation IS • Is conducted for specific users and for specific uses • Involves individuals or groups of people who care about the results; in evaluation, those persons are identified so that their needs can be addressed • Is an applied, practical activity

  7. What Program Evaluation IS NOT • Program evaluation is not the same as research; it is a different type of inquiry and requires its own set of skills. • Program evaluation is not something that must be conducted by an outside “expert”. • Those involved in the program may design an evaluation that will be more useful to them in making decisions about improvement.

  8. What Program Evaluation IS NOT (continued) • Program evaluation is not about “proving” that something is or is not working. Its two major purposes are to either: 1) improve a program, or 2) determine to what extent the desired outcomes of a program were met.

  9. Standards for Evaluation [4] • Utility: must serve the practical information needs of the intended users. • Feasibility: must be realistic, prudent, diplomatic, and frugal. • Propriety: must be conducted legally and ethically, and with due regard for those involved in the evaluation and those affected by its results. • Accuracy: must reveal and report accurate information about the evaluated program.

  10. Program Evaluation Steps Step 1: What will be evaluated? • What is the object of the evaluation? • Which part of the program will be evaluated? • Resources? • Processes? • Outcomes?

  11. Program Evaluation Steps (continued) Step 2: Who wants the information and why? • Who are the stakeholders? • How do they intend to use the information?

  12. Program Evaluation Steps (continued) Step 3: What questions can be answered by the evaluation?

  13. Program Evaluation Steps (continued) Step 4: How will the information be gathered? • For each evaluation question: • What information is needed? • What is the source of each identified piece of information? • By what method will each piece of information be gathered?
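
One common way to organize Step 4 is an evaluation matrix that pairs each evaluation question with the information needed, its source, and the collection method. Below is a minimal sketch in Python; the questions, sources, and methods are hypothetical examples added for illustration, not part of the original plan.

```python
# Hypothetical evaluation matrix for Step 4: each question is mapped to the
# information needed, its source, and the method used to gather it.
evaluation_matrix = [
    {
        "question": "Did reading scores improve after the new literacy program?",
        "information_needed": "Standardized reading scores, before and after",
        "source": "School assessment records",
        "method": "Document review",
    },
    {
        "question": "How do teachers perceive the program's usefulness?",
        "information_needed": "Teacher opinions and reported practices",
        "source": "Classroom teachers",
        "method": "Survey and follow-up interviews",
    },
]

# Print the plan so each question shows its data need, source, and method.
for row in evaluation_matrix:
    print(f"Q: {row['question']}")
    print(f"  needs:  {row['information_needed']}")
    print(f"  source: {row['source']}")
    print(f"  method: {row['method']}")
```

Listing the plan this way makes it easy to spot evaluation questions that still lack a data source or a collection method.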

  14. Program Evaluation Steps (continued) Step 5: How will information collection be managed? • Information Accuracy: accurate coding, verification, data storage • Information Storage: computer or written records; data and field note files • Information Security: protection from unauthorized use; issues of anonymity
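
The anonymity concern in Step 5 is often handled by separating identifying information from the evaluation data before storage. The sketch below illustrates one such approach; the respondent names, rating field, and file name are hypothetical.

```python
# A minimal sketch of anonymized storage for Step 5: respondents are stored
# under generated codes, and the name-to-code key is kept separately so that
# the analysis file never contains names. All names and records are hypothetical.
import csv

responses = [
    {"name": "Teacher A", "rating": 4},
    {"name": "Teacher B", "rating": 5},
]

code_key = {}     # kept in a secure, access-controlled location
anonymized = []   # safe to share with the analysis team

for i, record in enumerate(responses, start=1):
    code = f"R{i:03d}"
    code_key[record["name"]] = code
    anonymized.append({"respondent": code, "rating": record["rating"]})

# Write the anonymized records to the evaluation data file.
with open("responses_anonymized.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["respondent", "rating"])
    writer.writeheader()
    writer.writerows(anonymized)
```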

  15. Program Evaluation Steps (continued) Step 6: How will the information be analyzed? • How will the data (quantitative and qualitative) be analyzed? • How will the results be interpreted? • Who will interpret the results?
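
For Step 6, quantitative data are typically summarized with simple descriptive statistics, while qualitative data are coded into themes and counted before interpretation. A minimal sketch, assuming hypothetical survey ratings and interview codes:

```python
# Hypothetical Step 6 analysis: descriptive statistics for survey ratings and
# a frequency count of qualitative codes assigned to interview comments.
from statistics import mean, stdev
from collections import Counter

ratings = [4, 5, 3, 4, 5, 2, 4]                      # e.g., 1-5 survey scale
themes = ["workload", "training", "training",
          "materials", "training", "workload"]       # codes from interviews

print(f"n = {len(ratings)}, mean = {mean(ratings):.2f}, sd = {stdev(ratings):.2f}")
for theme, count in Counter(themes).most_common():
    print(f"{theme}: mentioned {count} times")
```

Whoever interprets these results still has to relate them back to the evaluation questions from Step 3; the summary itself is only the starting point.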

  16. Program Evaluation Steps (continued) Step 7: How will the information be reported and used? • What are the components of a report? • How should the reports for various audiences differ?

  17. Program Evaluation Steps (continued) Step 8: How should an evaluation be managed? • Timeline • Clear responsibilities • Budget • Human subject plan • Adherence to evaluation standards
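
The management elements in Step 8 (timeline, responsibilities, budget) can be captured in a simple structured plan. The sketch below uses hypothetical tasks, owners, dates, and Omani rial amounts purely for illustration.

```python
# Hypothetical management plan for Step 8: tasks with owners, deadlines, and
# budget lines, summed for total cost and listed in timeline order.
from datetime import date

tasks = [
    {"task": "Design survey instruments", "owner": "Evaluation team lead",
     "due": date(2024, 9, 15), "budget_omr": 200},
    {"task": "Collect and store data", "owner": "Senior teacher",
     "due": date(2024, 11, 1), "budget_omr": 350},
    {"task": "Analyze data and draft report", "owner": "Evaluation team lead",
     "due": date(2025, 1, 10), "budget_omr": 400},
]

print(f"Total budget: {sum(t['budget_omr'] for t in tasks)} OMR")
for t in sorted(tasks, key=lambda t: t["due"]):
    print(f"{t['due']}  {t['task']}  ({t['owner']})")
```

Sorting by due date gives a quick timeline view, and summing the budget lines checks the plan against the available funds.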

  18. Program Evaluation Steps (continued) Step 9: What can be learned from this evaluation? • Evaluate the evaluation using the program evaluation standards to: • Document problems and areas for possible improvements • Review methods and instruments for possible improvements • Look for unintended consequences of the evaluation
