
Educator Evaluation System


Presentation Transcript


  1. Educator Evaluation System Peabody Public Schools June 21, 2012

  2. RPS Educator Evaluation Wiki • Wiki with Resources • http://rpseducatorevaluation.wikispaces.com/

  3. Let’s Take a Few Minutes • Take a few minutes to write down any burning questions that you may have in relation to the evaluation process

  4. Burning Questions

  5. Burning Questions

  6. Burning Questions

  7. Valve Handbook for New Employees: Risks (What if I screw up?) “Nobody has ever been fired at Valve for making a mistake. It wouldn’t even make sense for us to operate that way. Providing the freedom to fail is an important trait of the company; we couldn’t expect so much of individuals if we also penalized people for errors. Even expensive mistakes, or ones which result in a very public failure, are genuinely looked at as opportunities to learn. We can always repair the mistake or make up for it.”

  8. Valve (Continued) “Screwing up is a great way to find out that your assumptions were wrong or that your model of the world was a little bit off. As long as you update your model and move forward with a better picture, you’re doing it right. Look for ways to test your beliefs. Never be afraid to run an experiment or collect more data. It helps to make predictions and anticipate nasty outcomes. Ask yourself, “What would I expect to see if I’m right?” Ask yourself, “What would I expect to see if I’m wrong?” Then ask yourself, “What do I see?” If something totally unexpected happens, try to figure out why.”

  9. Valve “There are still some bad ways to fail. Repeating the same mistake over and over is one. Not listening to customers or peers before or after failure is another. Never ignore the evidence; particularly when it says you’re wrong.”

  10. Agenda • Discussion of Educator Evaluation Regulations • Engaging Educators in the Process • SMART Goal Development • Thoughts from an Early Adopter • Questions (feel free to ask questions throughout the workshop)

  11. Every Beginning is Difficult • http://www.youtube.com/watch?v=pQHX-SjgQvQ

  12. Educator Evaluation Model System http://www.doe.mass.edu/edeval/

  13. Educator Evaluation • New DESE Regulations approved on June 28, 2011 • Collaboratively Designed by • Massachusetts Teachers Association • Massachusetts Association of Secondary School Principals • Massachusetts Elementary School Principals Association • Massachusetts Association of School Superintendents • Department of Elementary and Secondary Education • Requires evaluation of all educators on a license • Designed to promote leaders’ and teachers’ growth and development • Designed to support and inspire excellent practice

  14. Reading is an Early Adopter • Our current system is comparable to the new DESE model • Allowed us to give significant input into the process • Developed a network with other school districts • Attended professional development opportunities • Piloted • Educator Plan with SMART Goals • Superintendent’s Evaluation Process • Principal Evaluation Process

  15. TAP Committee: A Key to the Process • Committee of Teachers, Building Administrators, Central Office Administrators • Representation from every school • Compared current rubric with model rubric system • Reviewed model contract language • Will be involved in development of forms for September 2012

  16. Components of System • Focuses on Educator Growth and not “Gotcha” • Educators are partners in the process • Five Step Evaluation Cycle • Self-Assessment • Analysis, Goal Setting, Educator Plan Development • Implementation of Plan • Formative Assessment (Midyear or Mid-cycle) • Summative Evaluation (End of Year/Cycle Evaluation) • Rubric for Evaluation • Use of Artifacts for Evidence • Lesson Plans, Professional Development Activities, Fliers • Walkthroughs • Announced and Unannounced observations • Differentiated Approach • New Teachers • Non-PTS Teachers • PTS Teachers • PTS Teachers who need additional support • Use of SMART Goals

  17. Components of System • Levels of Performance on Rubric • Exemplary • Proficient • Needs Improvement • Unsatisfactory • Specificity of Rubric • Standards • Indicators • Elements • Four Standards • Multiple Measures of Student Performance (2013-14 School Year) • Use of student surveys (2014-15 School Year)

  18. Continuous Learning: 5 Step Evaluation Cycle • Every educator is an active participant in the evaluation • Process promotes collaboration and continuous learning • Foundation for the Model

  19. 5 Step Evaluation Cycle: Rubrics • Every educator uses a rubric to self-assess against the Performance Standards • Professional Practice goals (team and/or individual) must be tied to one or more Performance Standards • The rubric is used to analyze performance and determine ratings on each Standard and overall • Evidence is collected for Standards and Indicators; the rubric should be used to provide feedback • The rubric is used to assess performance and/or progress toward goals (Part III: Guide to Rubrics, pages 4-5)

  20. [Diagram]

  21. Continuous Learning: 5 Step Cycle in Action for Specialized Instructional Support Personnel • Self-Assessment: The counselor reviews data and identifies three areas for improvement: grade 8 transition issues for special education students, YRBS data on students feeling emotionally safe at school, and low participation levels for students in the Teen Screen program. • Goal Setting and Plan Development: The counselor works with the Director of Guidance to develop a department professional practice goal on grade 8 transition; works with health educators, social workers, and school psychologists on a team student learning goal to improve the emotional safety of students; and works with the Behavioral Health Coordinator on a team student learning goal to increase the percentage of students who participate in the Teen Screen program. • Plan Implementation: The counselor gathers and synthesizes evidence on progress toward the goals in the Educator Plan; the Director of Guidance focuses data collection on the goal areas. • Formative Assessment: Midway through the cycle, the Director of Guidance, the counselor, and the department/teams review evidence and assess progress on goals, making adjustments to the action plan or benchmarks if needed. • Summative Evaluation: The counselor receives a rating on each Standard plus an overall rating based on performance against the Standards and progress on the three goals.

  22. Four Different Educator Plans • The Developing Educator Plan (Non-PTS Teachers and teachers new to a position) is developed by the educator and the evaluator and is for one school year or less. • The Self-Directed Growth Plan (PTS Teachers) applies to educators rated Proficient or Exemplary and is developed by the educator. When the Rating of Impact on Student Learning is implemented (beginning in 2013-14), educators with a Moderate or High Rating of Impact will be on a two-year plan; educators with a Low Rating will be on a one-year plan. • The Directed Growth Plan (PTS Teachers) applies to educators rated Needs Improvement and is a plan of one school year or less developed by the educator and the evaluator. • The Improvement Plan (PTS Teachers) applies to educators rated Unsatisfactory and is a plan of no less than 30 calendar days and no longer than one school year, developed by the evaluator.

  23. Goal Setting Process: Focus-Coherence-Synergy [Diagram aligning goals across levels: School Committee and District Strategy inform Superintendent Goals; School Improvement Plans inform Principal Goals; Classroom Practice and Teacher Goals drive Student Achievement]

  24. Standards, Indicators and Rubrics • Standards (4): Required in Regulations • Instructional Leadership (5 Indicators) • Management and Operations (5 Indicators) • Family and Community Engagement (4 Indicators) • Professional Culture (6 Indicators) • Indicators (20): Required in Regulations • Elements (32): May be modified, but must keep rigor • Rubrics: A tool for making explicit and specific the behaviors and actions present at each level of performance.

  25. The framework establishes four standards of practice, with supporting rubrics defining four levels of effectiveness. (* denotes a standard on which an educator must earn a proficient rating to earn an overall proficient or exemplary rating; earning professional teacher status without proficient ratings on all four standards requires superintendent review.)

  26. Model Rubrics: Structure (Part III: Guide to Rubrics, page 6)

  27. Model Rubrics: Vertical Alignment within Rubrics (Part III: Guide to Rubrics, Appendix C, pages 2-4) • Example: Teacher Rubric • Standard I: “Standard I. Curriculum, Planning, and Assessment” • Indicator B: “Indicator I-B. Assessment” • Elements 1 & 2: I-B-1: Variety of Assessment Methods; I-B-2: Adjustments to Practice

  28. Model Rubrics: Structure (Part III: Guide to Rubrics, page 6)

  29. The Model Rubrics are Aligned

  30. Rubric Alignment, e.g., Goal Setting • Superintendent Rubric (I-D-1): Supports administrators and administrator teams to develop and attain meaningful, actionable, and measurable professional practice, student learning, and, where appropriate, district/school improvement goals. • Principal/School-level Administrator Rubric (I-D-1): Supports educators and educator teams to develop and attain meaningful, actionable, and measurable professional practice and student learning goals. • Teacher Rubric (IV-A-2): Proposes challenging, measurable professional practice, team, and student learning goals that are based on thorough self-assessment and analysis of student learning data.

  31. Alignment of Rubrics, e.g., Goal Setting

  32. Exemplary • “The educator’s performance significantly exceeds Proficient and could serve as a model for leaders district-wide or even statewide. Few educators—principals and superintendents included—are expected to demonstrate Exemplary performance on more than a small number of Indicators or Standards.” (Part III: Guide to Rubrics, page 14)

  33. Proficient • “Proficient is the expected, rigorous level of performance for educators. It is the demanding but attainable level of performance for most educators.” (Part III: Guide to Rubrics, page 9)

  34. Needs Improvement • Educators whose performance on a Standard is rated as Needs Improvement may demonstrate inconsistencies in practice or weaknesses in a few key areas. They may not yet fully integrate and/or apply their knowledge and skills in an effective way. They may be new to the field or to this assignment and are developing their craft.

  35. Unsatisfactory • Educators whose performance on a Standard is rated as Unsatisfactory are significantly underperforming as compared to the expectations. Unsatisfactory performance requires urgent attention.

  36. Example of Teacher Rubric • Standard I: Curriculum, Planning, and Assessment. The teacher promotes the learning and growth of all students by providing high-quality and coherent instruction, designing and administering authentic and meaningful student assessments, analyzing student performance and growth data, using this data to improve instruction, providing students with constructive feedback on an ongoing basis, and continuously refining learning objectives.

  37. Example • Indicator I-A. Curriculum and Planning: Knows the subject matter well, has a good grasp of child development and how students learn, and designs effective and rigorous standards-based units of instruction consisting of well-structured lessons with measurable outcomes.

  38. Example • Element A-1. Subject Matter Knowledge • Proficient: Demonstrates sound knowledge and understanding of the subject matter and the pedagogy it requires by consistently engaging students in learning experiences that enable them to acquire complex knowledge and skills in the subject.

  39. Multiple sources of evidence inform the summative performance rating

  40. Multiple sources of evidence inform the evaluation [Diagram] • Evidence: Products of Practice (e.g., observations), Multiple Measures of Student Learning, Other Evidence (e.g., student surveys) • Evidence on Standards 1-4 is assessed through the rubric, along with Attainment of Educator Practice Goal(s) and Student Learning Goal(s) as identified in the Educator Plan (Did Not Meet, Some Progress, Significant Progress, Met, Exceeded) • Summative Performance Rating: Exemplary, Proficient, Needs Improvement, or Unsatisfactory • Rating of Impact on Student Learning: Low, Moderate, or High, based on Trends and Patterns in at Least Two Measures of Student Learning Gains (MCAS growth and MEPA gains where available; measures must be comparable across schools, grades, and subject matter district-wide) • Outcomes for Educator: recognition and rewards; type and duration of Educator Plan

  41. Rating System Until Impact on Student Learning is Implemented in 2013-14/2014-15

  42. Educators earn two separate ratings

  43. Educators earn two separate ratings • Summative Performance Rating: based on the Rating of Performance on each of the 4 Standards plus Attainment of Goals • Rating of Impact on Student Learning: based on Trends and Patterns on state- and district-determined measures of student learning gains

  44. Phase-in Over Next 2 Years • Phase 1: Summative ratings based on attainment of goals and performance against the four Standards defined in the educator evaluation requirements (September 2012) • Phase 2: Rating of educator impact on student learning gains based on trends and patterns of multiple measures of student learning gains (September 2013) • Phase 3: Using feedback from students (for teachers) and teachers (for administrators) (September 2014)

  45. District Determined Measures (DDM) Timeline • September 30, 2013: All districts are expected to identify their district determined measures and their process for rating educator impact on student learning. • 2013-14 School Year: All districts implement the DDMs. Non-Level 4 districts may choose to use the 2013-14 school year as a pilot year to test out their DDMs. • By October 2014: Level 4 districts complete their first year of data collection on educator impact on student learning. No ratings are assigned (two years of data are required). All other districts may either collect their first year of data on educator impact on student learning or treat the 2013-14 school year as a pilot.

  46. District Determined Measures (DDM) Timeline • By October 2015: Level 4 districts report educator impact ratings to DESE. All other districts either collect their first year of data on educator impact on student learning or, if they did not use the 2013-14 school year as a pilot, report educator impact ratings to DESE based on ratings from the 2013-14 and 2014-15 school years. • By October 2016: All districts report educator impact on student learning ratings to DESE based on the previous two years of impact data.

  47. District Determined Measures • The timeline may be different for administrators and for MCAS, MEPA, and AP results • Measures of student learning should focus on growth, not just achievement • Growth measures will only be useful if they pertain to a relevant group of students for the educator being evaluated.

  48. Possible Examples of DDMs • Direct Measures (assess student growth in a specific subject area over time) • MCAS Growth Percentiles in Math and ELA • Other standardized assessments of student achievement • Portfolios of student work • Performance assessments • Indirect Measures (do not measure student growth in a specific subject area, but measure the consequences of that learning) • Changes in graduation rates • College enrollment rates • College remediation rates

  49. Roles of Educators • Teachers • PreK-High School • Special Education • ELL • Vocational Education • World Languages • Health, PE, Family and Consumer Science, Arts • Administrators • Superintendents • Other District Administrators • Principals, Assistant Principals • Teachers with supervisory responsibilities, including department chairs
