
EVAL 6000: Foundations of Evaluation


Presentation Transcript


  1. EVAL 6000: Foundations of Evaluation Carl D. Westine & Dr. Chris L. S. Coryn December 2, 2010

  2. Agenda • The Program Evaluation Standards (45 min) • Overview of JCSEE and the standards • The “new” 30 standards • 2nd Edition vs. 3rd Edition • Research on overlaps across standards • Activity on 3rd Edition • Stufflebeam and Coryn 2010 (10 min) • Stufflebeam 2001 (10 min) • Break (15 min) • Activity: Hard-Won Lessons (60 min)

  3. The JCSEE • The Joint Committee on Standards for Educational Evaluation was created in 1975 and currently oversees the maintenance of and updates to the Program Evaluation Standards • Made up of representatives from 12-17 professional societies • WMU folks involved in the JCSEE: Daniel Stufflebeam, James Sanders, Arlen Gullickson • Daniela Schröter is currently becoming more involved with the revision process (our point of contact)

  4. What Are Standards? • The JCSEE defined an evaluation standard as a “principle commonly agreed to by experts in the conduct and use of evaluation, that when implemented will lead to greater evaluation quality” (JCSEE, 2010, p. 292) • The focus is on North America, but other countries have been adapting or adopting the standards • Other standards exist (GAO, Guiding Principles, etc.) • Written for educational programs, but education is relied upon by nearly every field

  5. Early Standards • The first edition was published in 1981. • Contained 30 standards • Defined the groups (Utility, Feasibility, Propriety, and Accuracy) • Spearheaded by Dan Stufflebeam (WMU) and a collection of education and evaluation experts

  6. Second Edition Standards • The second edition was published in 1994 • Contained 30 standards • Maintained the same group structure • Accredited as official standards through the American National Standards Institute (ANSI) • Update led by James Sanders (WMU)

  7. Current Happenings • Transition from the 2nd Edition to the 3rd Edition • The Third Edition is being published, and copies can be ordered through the JCSEE website: www.jcsee.org • Don Yarbrough at the University of Iowa oversees the JCSEE and the development of the newest standards

  8. The Latest Edition • 3rd Edition • Took more than 5 years to develop and finalize (several delays in publishing) • The next update has already begun • JCSEE continuously takes suggestions and feedback from users • Goal to update standards more frequently

  9. Organization of the PES Book • Applying the Standards • Functional Table of Standards • Shows which standards are relevant at points along the evaluation continuum • Standards • Group Overview/Scenario • Standard Statements • Rationale/Clarification (Overview) • Implementing (Guidelines) • Hazards (Common Errors) • Application(s) • Documentation

  10. The Program Evaluation Standards • The Program Evaluation Standards for educational evaluations • Should be used for: • Guiding an evaluation effort (formative) • Assessing the quality of educational programs (summative) • Assessing the quality of an evaluation of an educational program (metaevaluation) • A tool to help policy makers understand evaluations • [Research on evaluation] (in my own opinion)

  11. The Program Evaluation Standards • There are now 5 categories totaling 30 individual standards • Designed to ensure that an evaluation will… • Utility (7 -> 8): “… serve the information needs of intended users.” • Feasibility (3 -> 4): “… be realistic, prudent, diplomatic, and frugal.” • Propriety (8 -> 7): “… be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.” • Accuracy (12 -> 8): “… reveal and convey technically adequate information about the features that determine worth or merit of the program being evaluated.” • Evaluation Accountability (0 -> 3): be well documented and held subject to internal and external evaluation (JCSEE, 2010).
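
A quick arithmetic check on the counts above: the 2nd Edition groups sum to 7 + 3 + 8 + 12 = 30 standards, and the 3rd Edition groups sum to 8 + 4 + 7 + 8 + 3 = 30. The total stays at 30; the standards are simply redistributed across five groups instead of four.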

  12. Utility Standards • U1 Evaluator Credibility Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context. • U2 Attention to Stakeholders Evaluations should devote attention to the full range of individuals and groups invested in the program and affected by its evaluation. • U3 Negotiated Purposes Evaluation purposes should be identified and continually negotiated based on the needs of stakeholders. • U4 Explicit Values Evaluations should clarify and specify the individual and cultural values underpinning purposes, processes, and judgments.

  13. Utility Standards (cont.) • U5 Relevant Information Evaluation information should serve the identified and emergent needs of stakeholders. • U6 Meaningful Processes and Products Evaluations should construct activities, descriptions, and judgments in ways that encourage participants to rediscover, reinterpret, or revise their understandings and behaviors. • U7 Timely and Appropriate Communicating and Reporting Evaluations should attend to the continuing information needs of their multiple audiences. • U8 Concern for Consequences and Influence Evaluations should promote responsible and adaptive use while guarding against unintended negative consequences and misuse.

  14. Feasibility Standards • F1 Project Management Evaluations should use effective project management strategies. • F2 Practical Procedures Evaluation procedures should be practical and responsive to the way the program operates. • F3 Contextual Viability Evaluations should recognize, monitor, and balance the cultural and political interests and needs of individuals and groups. • F4 Resource Use Evaluations should use resources effectively and efficiently.

  15. Propriety Standards • P1 Responsive and Inclusive Orientation Evaluations should be responsive to stakeholders and their communities. • P2 Formal Agreements Evaluation agreements should be negotiated to make obligations explicit and take into account the needs, expectations, and cultural contexts of clients and other stakeholders. • P3 Human Rights and Respect Evaluations should be designed and conducted to protect human and legal rights and maintain the dignity of participants and other stakeholders. • P4 Clarity and Fairness Evaluations should be understandable and fair in addressing stakeholder needs and purposes.

  16. Propriety Standards (cont.) • P5 Transparency and Disclosure Evaluations should provide complete descriptions of findings, limitations, and conclusions to all stakeholders, unless doing so would violate legal and propriety obligations. • P6 Conflicts of Interests Evaluations should openly and honestly identify and address real or perceived conflicts of interests that may compromise the evaluation. • P7 Fiscal Responsibility Evaluations should account for all expended resources and comply with sound fiscal procedures and processes.

  17. Accuracy Standards • A1 Justified Conclusions and Decisions Evaluation conclusions and decisions should be explicitly justified in the cultures and contexts where they have consequences. • A2 Valid Information Evaluation information should serve the intended purposes and support valid interpretations. • A3 Reliable Information Evaluation procedures should yield sufficiently dependable and consistent information for the intended uses. • A4 Explicit Program and Context Descriptions Evaluations should document programs and their contexts with appropriate detail and scope for the evaluation purposes.

  18. Accuracy Standards (cont.) • A5 Information Management Evaluations should employ systematic information collection, review, verification, and storage methods. • A6 Sound Designs and Analyses Evaluations should employ technically adequate designs and analyses that are appropriate for the evaluation purposes. • A7 Explicit Evaluation Reasoning Evaluation reasoning leading from information and analyses to findings, interpretations, conclusions, and judgments should be clearly and completely documented. • A8 Communication and Reporting Evaluation communications should have adequate scope and guard against misconceptions, biases, distortions, and errors.

  19. Evaluation Accountability Standards • E1 Evaluation Documentation Evaluations should fully document their negotiated purposes and implemented designs, procedures, data, and outcomes. • E2 Internal Metaevaluation Evaluators should use these and other applicable standards to examine the accountability of the evaluation design, procedures employed, information collected, and outcomes. • E3 External Metaevaluation Program evaluation sponsors, clients, evaluators, and other stakeholders should encourage the conduct of external metaevaluations using these and other applicable standards.

  20. Utility (Old vs. New) -1+2=+1

  21. Feasibility (Old vs. New) +1

  22. Propriety (Old vs. New) -1

  23. Accuracy (Old vs. New) -3+1=-2

  24. Evaluation Accountability (Old vs. New) +1

  25. Significant Changes • Managing focus (Process Use) • Project • Information • Less emphasis on “actionable” statements • Evaluation Accountability • New standard group • Transparency/documentation • Metaevaluation

  26. Significant Changes (cont.) • Combining of Standards • Quant/Qual into Design • Human Rights/Interactions into Respect • Clarity/Timeliness/Dissemination into Appropriate Communication • Expansion of the drawn-out examples/scenarios

  27. Using the Standards • Three principles guide the use of the Program Evaluation Standards • Standards require adaptive, responsive, and mindful use • The user must discover how to apply them in each specific situation • The order of the Standard groups does not matter (contrary to what others say: Utilization-Focused, Accuracy) • In-depth knowledge is required • Don’t just read the individual standards

  28. Overlaps Across the Standards • The Functional Table of Standards begins the process of identifying standards that are related to each other at specific points in the evaluation process • How should we identify overlaps (dependency relationships) among the standards? • Could this enhance metaevaluation efficiency? • Do specific standards have more significance than others?
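
One way to make the overlap question concrete is to treat the standards as nodes in a graph and each identified overlap as an edge, then ask which standards are most connected. The Python sketch below is only an illustration of that representation; the overlap pairs listed are hypothetical placeholders, not relationships asserted by the JCSEE or the Functional Table of Standards.

from collections import defaultdict

# Hypothetical overlap pairs between standards (illustrative placeholders only,
# not taken from the JCSEE materials).
overlaps = [
    ("U3", "P2"),  # negotiated purposes <-> formal agreements
    ("U7", "A8"),  # reporting-related standards
    ("A6", "E2"),  # internal metaevaluation examines designs and analyses
    ("U1", "E2"),  # evaluator credibility examined in internal metaevaluation
]

# Build an undirected adjacency list: standard -> set of overlapping standards.
graph = defaultdict(set)
for a, b in overlaps:
    graph[a].add(b)
    graph[b].add(a)

# Rank standards by how many others they overlap with (degree); a high degree
# might flag standards worth checking first in a metaevaluation.
for standard, neighbors in sorted(graph.items(), key=lambda kv: -len(kv[1])):
    print(standard, len(neighbors), sorted(neighbors))

Under this kind of representation, metaevaluation efficiency could come from reviewing the most-connected standards first, though whether some standards really carry more weight than others remains the open question the slide raises.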

  29. Activity • Four Groups: Utility, Feasibility/Evaluation Accountability, Propriety, and Accuracy • Identify the keywords in the standard statements. Example: U1 Evaluator Credibility “Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context.”
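
As a rough illustration of the keyword exercise, the sketch below strips common function words from a standard statement and keeps the remaining candidate keywords. The stop-word list is an ad hoc assumption made for this example, not part of the JCSEE materials.

import re

# Ad hoc stop-word list (an assumption for this illustration only).
STOP_WORDS = {
    "should", "be", "by", "who", "and", "in", "the", "of", "to", "a", "an",
    "their", "that", "with", "for",
}

def candidate_keywords(statement: str) -> list[str]:
    """Return the lowercased words of a statement that are not stop words."""
    words = re.findall(r"[a-z]+", statement.lower())
    return [w for w in words if w not in STOP_WORDS]

# U1 Evaluator Credibility, quoted from the slide above.
u1 = ("Evaluations should be conducted by qualified people who establish "
      "and maintain credibility in the evaluation context.")
print(candidate_keywords(u1))
# ['evaluations', 'conducted', 'qualified', 'people', 'establish',
#  'maintain', 'credibility', 'evaluation', 'context']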
