
Evaluation Systems: Context, Issues, and Perspectives


Presentation Transcript


  1. Evaluation Systems: Context, Issues, and Perspectives Charles Igel Senior Researcher, McREL Brian Ewert Superintendent, Englewood Public Schools Barbara Lunsford Associate Superintendent, Georgia DOE Kerry Englert President, Seneca Consulting Jessica Allen Researcher, McREL Tony Davis Senior Director, McREL

  2. Framework for Evaluation Systems. Teacher effectiveness is defined by quality standards (Leadership, Environment, Content, Learning, Practice, Growth), weighted 50% professional practice and 50% measures of student growth, and summarized in one of four performance ratings: Developing, Proficient, Accomplished, or Distinguished.
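
A note on the arithmetic behind slide 2: the 50/50 split and four rating levels imply a simple composite calculation. A minimal sketch in Python, assuming scores normalized to 0-1; the cut scores are illustrative placeholders, since the actual cut scores and per-standard weights are not given on the slide.

    RATINGS = ["Developing", "Proficient", "Accomplished", "Distinguished"]

    def composite_score(practice: float, growth: float) -> float:
        """Combine professional-practice and student-growth scores (each 0-1)."""
        return 0.5 * practice + 0.5 * growth

    def performance_rating(score: float) -> str:
        """Map a composite score onto the four rating levels."""
        cuts = [0.25, 0.50, 0.75]  # hypothetical cut scores, not from the slide
        return RATINGS[sum(score >= c for c in cuts)]

    print(performance_rating(composite_score(practice=0.8, growth=0.6)))  # Accomplished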

  3. Examples of large-scale evaluation initiatives

  4. Considerations for developing and rolling out educator evaluation systems

  5. Colorado SCEE Example • Human Judgment: Data should inform decisions, but human judgment will always be an essential component of evaluations • Continuous Improvement: The implementation and assessment of the evaluation system must embody continuous improvement • Feedback: The purpose of the system is to provide meaningful and credible feedback that improves performance • Alignment: Educator evaluation must take place within a larger system that is aligned and supportive. State Council for Educator Effectiveness. (2011, May). State Council for Educator Effectiveness report and recommendations. Denver, CO. Available at www.cde.state.co.us/EducatorEffectiveness

  6. Additional considerations: Educational Leadership Policy Standards (ISLLC 2008) S1: Facilitating the development and implementation of a shared vision for learning S2: Nurturing a culture conducive to student learning and professional growth S3: Managing for a safe, efficient, and effective learning environment S4: Collaborating with and mobilizing faculty and community resources S5: Acting in a fair and ethical manner S6: Responding to and influencing the political, legal, and cultural environment

  7. Additional considerations: InTASC Model Core Teaching Standards S1: Uses developmentally appropriate instruction S2: Establishes inclusive learning environments S3: Promotes positive peer interactions and engagement with material S4: Understands content and makes it meaningful and accessible to all learners S5: Connects content in ways that promote creativity and critical thinking S6: Uses multiple methods of assessment S7: Uses all professional knowledge in lesson planning S8: Uses multiple instructional strategies S9: Active in own professional growth S10: Takes appropriate leadership positions and actively engages in the profession

  8. Additional considerations: Educational Data Exchange Network (EDEN) Reporting What is reported? 1) Teacher counts categorized by performance level 2) Total number of teachers evaluated 3) Principal counts categorized by performance level 4) Total number of principals evaluated Guidance is provided on: 1) teachers and/or principals assigned to multiple schools in the same LEA 2) teachers and/or principals assigned to multiple schools in different LEAs
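
The counts EDEN asks for on slide 8 amount to a small aggregation. A minimal sketch, assuming a hypothetical record layout (field names are illustrative, not EDEN's actual file specification):

    from collections import Counter

    # Hypothetical evaluation records; field names are illustrative, not EDEN's.
    evaluations = [
        {"teacher_id": "T1", "lea": "LEA-01", "level": "Proficient"},
        {"teacher_id": "T2", "lea": "LEA-01", "level": "Developing"},
        {"teacher_id": "T1", "lea": "LEA-01", "level": "Proficient"},  # second school, same LEA
        {"teacher_id": "T3", "lea": "LEA-02", "level": "Proficient"},
    ]

    # Count each teacher once per LEA even when assigned to multiple schools,
    # in the spirit of the multi-school guidance the slide mentions.
    unique = {(e["teacher_id"], e["lea"]): e["level"] for e in evaluations}
    counts_by_level = Counter(unique.values())

    print(dict(counts_by_level))          # {'Proficient': 2, 'Developing': 1}
    print(sum(counts_by_level.values()))  # total teachers evaluated: 3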

  9. Charles Igel, PhD Mid-continent Research for Education & Learning (McREL) cigel@mcrel.org 303.632.5574

  10. The Sinking of the Titanic

  11. Purpose of Evaluation • Serves as an accountability and compliance system • Serves as a tool to help in employment decisions • Provides a set of rigorous research-based standards • Serves as a measurement of performance and effectiveness • Promotes leadership • Serves as a personal, reflective tool to become more effective • Serves as the basis of instructional improvement (and school improvement) • Enhances the implementation of the approved curriculum • Guides/drives continuous improvement through professional development • Serves as a tool in a coaching/mentoring model • Serves as a tool to determine performance pay (in some places) • Informs higher education & research

  12. Englewood Schools: ICEBERG DEAD AHEAD! • Lack of Expectations & Non-negotiables • Lack of Accountability • Lack of Transparency • Lack of Quality • Lack of Fidelity • Lack of Consistency • Lack of Reliability & Validity • Lack of Integrity

  13. ICEBERG DEAD AHEAD! • Lack of Reflective Practice • Deficit/Punitive Model vs. Continuous Improvement Model • Lack of Performance Pathways for Growth • Lack of Policies, Procedures, and Agreed Upon Practices • Lack of a Framework to Drive Conversation • Lack of Goal Setting and Professional Growth

  14. The Lifeboat • Teacher/Principal Effectiveness Bill • Creates a four-tiered system to evaluate the effectiveness of licensed personnel as a means of improving the quality of education in Colorado. The basic purposes of the statewide system are: • To ensure that all licensed personnel are evaluated using multiple, fair, transparent, timely, rigorous, and valid methods, with fifty percent of the evaluation determined by the academic growth of their students • To ensure that all licensed personnel receive adequate feedback and professional development support to provide them a meaningful opportunity to improve their effectiveness • To ensure that all licensed personnel are provided the means to share effective practices with other educators throughout the state

  15. The Lifeboat • Solidarity & Alignment: Board, Administrators, Principals, & Union • Non-negotiables and Expectations • Accountability: 100% Participation • Coaching/Collaborative/Reflective Model • Clearly Defined Framework with Common Language • Low Stakes Implementation with: Quality, Fidelity, Intensity, Consistency

  16. The Lifeboat • 21st Century Instructional Model • State and Common Core Standards • Data Teams • Tied to School & District Improvement Plans • Performance Pathways for Growth • Aligned, Collaborative Goal Setting • Professional Development Based on Data

  17. RMS Carpathia & Pier 54, NYC • Much Work To Do! • Robust, Reliable, Intuitive, Flexible, & Easy-to-Use Technology & Web Tools • Seamless Integration with Other Initiatives • 8:1 Ratio • Peer Coaching and Evaluation • Quality-Fidelity-Intensity-Consistency-Reliability-Validity

  18. RMS Carpathia & Pier 54, NYC • Formative and Summative Observation & Evaluation Data • Deep, Meaningful Conversations around Teacher Standards, Elements, Rubrics, & Ratings • Real Time, Needs-Based Professional Development • Value-Added, Student Growth Measures • Valid, Reliable, Vetted Metrics & Algorithms • Group/Collective Performance Pay • Employment Decisions

  19. Georgia Department of Education CLASS Keys (Classroom Analysis of State Standards): A Teacher Evaluation Process. Barbara Lunsford, Associate Superintendent, School Improvement

  20. CLASS KeysSM Strands and Elements

  21. The CLASS KeysSM rubrics are organized as Strand > Standard > Element. Example element, Assessment: the collecting and analyzing of student performance data to identify patterns of achievement and underachievement in order to design and implement appropriate instructional interventions. Each element is scored against a Continuum of Improvement rubric.

  22. Performance on the elements of the CLASS KeysSM is identified on a four-level continuum. This continuum is not utilized to label teachers as Not Evident, Emerging, Proficient, or Exemplary. The continuum is used to describe a teacher’s PERFORMANCE on specific elements.
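
One way to picture the element-level continuum from slides 21-22: ratings attach to specific elements rather than to the teacher as a whole. A sketch with hypothetical element names (CLASS KeysSM defines 28 rubrics; these two are purely illustrative):

    from enum import IntEnum

    class Continuum(IntEnum):
        NOT_EVIDENT = 0
        EMERGING = 1
        PROFICIENT = 2
        EXEMPLARY = 3

    # Ratings attach to individual elements, never to the teacher as a whole.
    element_ratings = {
        "Assessment: analyzes student performance data": Continuum.PROFICIENT,
        "Instruction: uses research-based strategies": Continuum.EMERGING,
    }

    for element, rating in element_ratings.items():
        print(f"{element} -> {rating.name.replace('_', ' ').title()}")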

  23. An array of evidence is collected from multiple sources during the year: announced, longer observations; unannounced, short observations; other artifacts and evidence from conferences, meetings, and reviews of teacher and student products; GTDR (Georgia Teacher Duties and Responsibilities) performance; and student achievement data, all feeding into the annual evaluation.

  24. CLASS KeysSM Modules Module 1: Content and Structure (The What?) Module 2: Overview of the Evaluation Process (The How?) Module 3: Self-Assessment and Reflection Module 4: Professional Growth Plan Module 5: Pre-Evaluation Conference Module 6: Informal Observations Module 7: Formal Observations Module 8: Georgia Teacher Duties and Responsibilities Module 9: Annual Evaluation Module 10: Professional Development Plan

  25. Evaluator Training Assessment • Three Sections: • Scoring the Elements in a Classroom Video • Annual Evaluation Performance Task • Technical Questions Regarding Process
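
The video-scoring section of the training assessment implies comparing a trainee's element scores against reference scores for the same classroom video. A minimal percent-agreement sketch; this is a common check for such assessments, not necessarily the scoring rule GaDOE actually uses, and the scores shown are invented:

    # Reference scores for the training video versus one trainee's scores,
    # element by element. Level names follow the continuum on slide 22.
    master  = ["Proficient", "Emerging", "Proficient", "Exemplary"]
    trainee = ["Proficient", "Proficient", "Proficient", "Exemplary"]

    matches = sum(m == t for m, t in zip(master, trainee))
    print(f"Exact agreement: {matches / len(master):.0%}")  # 75%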

  26. Validity in a Standards-Based Teacher Evaluation System Kerry Englert, PhD, Seneca Consulting Jessica Allen, MA, McREL Council of Chief State School Officers National Conference on Student Assessment, June 19, 2011

  27. Presentation Purposes • Provide an overview of our experiences conducting an initial validity study of a state’s standards-based teacher evaluation system. • Propose a framework that can be used when designing these validation studies. • Offer suggestions and considerations for validation studies of educator evaluation systems.

  28. National Issues in Teacher Effectiveness • Education Research • The connection between teacher effectiveness and student learning is well documented (e.g., Goe, 2007). • Education Policy • Focusing on teacher evaluation and the relationship between teacher effectiveness and student achievement. • Race to the Top • White House’s Blueprint for Reform

  29. Standards-Based Evaluation System (from Odden, 2004) Three core components: • A set of standards. • Procedures for collecting performance data on the standards. • Rubrics to evaluate performance. Optional fourth component: • Methodology for incorporating evaluation results into personnel decisions.

  30. Why do a validity study? • Race to the Top • Requires valid and reliable teacher evaluation instruments. • Validity evidence • Builds credibility for the instrument and evaluation system. • Provides information on data use and implementation. • Provides information on continual development. • Can focus training efforts.

  31. CLASS KeysSM Teacher Evaluation System • Purposes • Serves a two-fold purpose: improvement and accountability • Core Components • Five major strands • Training materials • 28 performance rubrics

  32. Validity Study of CLASS KeysSM System • Study Context • Quick turnaround time • Extant data from pilot study • Readily available and accessible information • Broad examination of the literature • Inform, focus, and guide the study • Better capitalize on the available data

  33. Relevant Literature • Validity Theory and Standards • Validity as a unified concept covering score interpretations and uses (Standards for Educational and Psychological Testing, 1999; Messick, 1989) • A strong theory of action (Kane, 2006) • Personnel Evaluation • Applying the concepts of fairness, transparency, efficiency, and practicality to the personnel evaluation context (Gullickson, 2008) • Quality Rubrics • Characteristics of quality rubrics (Arter & McTighe, 2001) • Implementation Fidelity • The degree to which the system is being implemented according to its specification (Ruiz-Primo, 2006)
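
Implementation fidelity in Ruiz-Primo’s sense can be illustrated as adherence to the specified process. The toy index below treats fidelity as the share of specified components actually observed; component names echo the module list on slide 24, the observed set is invented, and a single proportion is a large simplification of a real multi-method fidelity study:

    # Components the evaluation process specifies (drawn from the module list
    # on slide 24) versus those actually observed in one school's rollout
    # (hypothetical data).
    specified = {"self_assessment", "growth_plan", "pre_conference",
                 "informal_observations", "formal_observations", "annual_evaluation"}
    observed = {"self_assessment", "informal_observations",
                "formal_observations", "annual_evaluation"}

    fidelity = len(specified & observed) / len(specified)
    print(f"Implementation fidelity: {fidelity:.0%}")  # 67%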

  34. Potential framework for validity in educator evaluation (framework diagram)

  35. Methods related to the framework • Validity Theory and Rubric Quality → Content • Validity Theory → Construct • Personnel Evaluation Standards and Validity Theory → Fairness and Reliability • Implementation Fidelity and Validity Theory → Use and Interpretation

  36. Future Considerations • Validation of educator evaluation systems might benefit from multiple perspectives. • Pilot and field tests are opportunities not only to test the instrument but also to apply validity methodologies and instruments. • Validation is an ongoing process and provides a framework to understand how the instrument is being used.

  37. References • American Psychological Association, American Educational Research Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Psychological Association. • Arter, J., & McTighe, J. (2001). Scoring rubrics in the classroom: Using performance criteria for assessing and improving student performance. Thousand Oaks, CA: Corwin. • Goe, L. (2007). The link between teacher quality and student outcomes: A research synthesis. National Comprehensive Center for Teacher Quality. Retrieved from http://www.tqsource.org/publications/LinkBetweenTQandStudentOutcomes.pdf • Gullickson, A. R. (2008). The personnel evaluation standards: How to assess systems for evaluating educators (2nd ed.). Joint Committee on Standards for Educational Evaluation. Thousand Oaks, CA: Corwin Press. • Kane, M. T. (2006). Validity. In R. L. Brennan (Ed.), Educational measurement (4th ed.). Westport, CT: Praeger. • Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13-103). New York: American Council on Education & Macmillan. • Odden, A. (2004). Lessons learned about standards-based teacher evaluation systems. Peabody Journal of Education, 79(4), 126-137. • Ruiz-Primo, M. A. (2006). A multi-method and multi-source approach for studying fidelity of implementation. National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
