
The Future of Special Education Teacher and Administrator Evaluations






Presentation Transcript


  1. The Future of Special Education Teacher and Administrator Evaluations 2013 Summer Institute Presentation by Vickie L. Coe Jordan M. Bullinger LaPointe & Butler, P.C.

  2. Part One Educator Evaluations

  3. MI Race to the Top Legislation • Enacted in 2009, effective January 4, 2010, amending the Revised School Code by adding Section 380.1249. • Performance evaluation system for teachers and school administrators • Governor’s Council on Educator Effectiveness • Recommendations on evaluation processes • Effectiveness label

  4. 380.1249 Teacher Provisions

  5. 380.1249 Teacher Provisions, cont.

  6. 380.1249 Teacher Provisions, cont.

  7. 380.1249 Administrative Eval Req.

  8. Part Two Introduction to MCEE Report

  9. Caveats • The following slides contain the MCEE’s recommendations. • The MCEE’s recommendations are NOT the law. • This presentation is not intended to be a labor or tenure law tutorial. • This presentation is not intended to provide legal advice. • The legislation and the MCEE Report reference a waiver process.

  10. Important Definitions Adopted by the Report • Teacher • Individual directly responsible for instruction aimed at helping a group of students reach goals defined in a well-specified curriculum over an extended period of time such as a quarter, semester, or academic year. (Report – 6 (hereinafter R-__)) • Teacher leader/master teacher • A teacher whose performance has been rated “professional” for at least 3 years in a row and who has additional skills in supporting the development and improvement of practice. (R-6)

  11. Important Definitions, continued • Non-teacher • Report indicates that it does not address non-teacher evaluations. • Includes ancillary staff as an example of non-teacher. • Leaves responsibility to define and evaluate non-teachers to the LEA. (R-11)

  12. Important Definitions, continued • Professional • Exhibits knowledge and capabilities expected of a skillful educator. • A teacher rated professional for three consecutive years: • May pursue opportunities for advanced roles or leadership. • Receives an evaluation every other year and a two-year improvement plan (goals). (R-8) • MCEE believes that the majority of teachers will receive a professional rating.

  13. Important Definitions, continued • Provisional • Exhibits some professional knowledge and skill, but has specific substantial identified weaknesses that should be addressed through feedback and targeted PD. • A teacher rated provisional or below for three consecutive years should be counseled out of his or her current role. (R-8)

  14. Important Definitions, continued • Ineffective • Exhibits performance that has specific critical identified weaknesses. • A teacher who receives an ineffective rating for two consecutive years should be terminated from further employment as a teacher in the current LEA. (R-8)

  15. Teacher Evaluation Overview

  16. Part Three Practice

  17. “Practice” • Practice • The work that teachers do to prepare and conduct instruction and to assess, communicate about and improve students’ learning. (R-7)

  18. Evaluation of Practice

  19. Evaluation of Practice, continued

  20. MDE/LEA “Practice” Responsibilities (R-10)

  21. Observer Responsibilities

  22. Observer Responsibilities, cont.

  23. Observer Responsibilities, cont.

  24. Observation Mechanics (R-11-12)

  25. Part Four Student Growth and Assessment Data

  26. “Student Growth” • Student growth • Refers to the change in students’ knowledge and skills across time. (R-13)

  27. Linking Growth to Instruction

  28. Growth and Assessment Tools • Assessments useful for measuring student growth can be provided in multiple ways: • Full service assessments, which are developed, administered, scored, and reported on centrally. • Model assessments, which are developed centrally, but administered, scored and reported locally. • Locally developed assessments, which are developed, administered, scored and reported on locally. (R-14)

  29. State Obligations re: Student Growth & Assessment Tools (R-15)

  30. MCEE Recommendations re: SGATs

  31. Tool Recommendations (R-16-21)

  32. Tool Recommendations continued

  33. General VAM Concepts (R-20) • Relationship to student growth and assessment tools • Statistical models that use data from SGATs to produce estimates of the “value added” by individual educators to student learning. • How? • By controlling for factors over which educators have little to no influence, e.g., incoming achievement, race/ethnicity, socioeconomic status, gender, special education status, and English language learner status. • Measures of value added for an individual educator are based on the typical deviation of his or her students’ achievement or growth from the achievement or growth those students were expected to demonstrate, given previous achievement and/or other factors over which the educator has little to no influence. • Highly controversial; considerable scientific disagreement.
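The modeling idea described above can be sketched as a toy calculation: regress students' current scores on prior achievement and background covariates, then average each teacher's residuals (observed minus expected achievement). This is a minimal illustration under simplified assumptions, not the MCEE's or the state's actual model; all function and data names here are hypothetical.

```python
import numpy as np

def value_added_estimates(prior, current, covariates, teacher_ids):
    """Toy VAM: regress current scores on prior scores and covariates,
    then average each teacher's residuals (observed minus expected)."""
    X = np.column_stack([np.ones(len(prior)), prior, covariates])
    beta, *_ = np.linalg.lstsq(X, current, rcond=None)
    residuals = current - X @ beta  # deviation from expected achievement
    return {t: float(residuals[teacher_ids == t].mean())
            for t in np.unique(teacher_ids)}

# Hypothetical data: teacher A's students beat expectations by ~5 points,
# teacher B's students fall short by ~5 points.
prior = np.array([50, 60, 70, 80, 50, 60, 70, 80], dtype=float)
current = np.array([55, 65, 75, 85, 45, 55, 65, 75], dtype=float)
ses = np.array([0, 1, 0, 1, 0, 1, 0, 1], dtype=float)  # stand-in covariate
teachers = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

va = value_added_estimates(prior, current, ses.reshape(-1, 1), teachers)
print(va)  # teacher A positive, teacher B negative
```

Real VAMs are far more elaborate (multiple prior years, shrinkage, measurement-error corrections), which is part of why the slide flags them as scientifically contested.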

  34. General Propositions (R-20-21) • MCEE’s conclusion: compared with the alternative of district-developed models of teaching effects, VAMs provide more reliable evidence. • VAMs should only be based on assessment data that support valid and reliable inferences about students’ growth. • In subject areas for which no VAM data are available, educators should be evaluated based on alternate measures of student growth and achievement. • LEAs decide the method for combining multiple VAM measures when teachers teach multiple content areas or sections. The state should provide VAM scores for individual educators on all state-mandated assessments and all optional assessments offered by the state.

  35. Key Value-Added Recommendations

  36. Complications in Calculating VAM

  37. Combining Scores • During the first two years of implementation (2015-2016 and 2016-2017), LEAs will: • Produce teacher ratings based on qualitative combinations of the categories. • Conduct a data-based standard-setting process so that, in the longer term, LEAs will have profiles of teachers’ performance on specific aspects of instructional quality and student progress.

  38. Example of a Qualitative Combination of Evaluation Categories

  39. Example of Proportional Relationship of Evaluation Data • Applies to teachers in core content areas in grades for which growth data from state-mandated assessments exist. • Assumes, at a minimum, that teacher-level VAMs comprise 50% of the student growth section and 25% of the overall evaluation. Also assumes building-level VAMs comprise 10% of the student growth section and 5% of the overall evaluation. “Other” measures comprise 20% of the practice section and 10% of the overall evaluation.

  40. Part Five Administrator Evaluations

  41. Who are Administrators? • According to the law, administrators are: • Principals • Assistant Principals • Curriculum coordinators • Superintendents • Assistant Superintendents • Career and technical education managers • Special education directors

  42. Evaluation Process Overview

  43. Evaluation Chronology • By October 1st, administrators and supervisors meet • School principals and their supervisors are to establish an agreement with regard to how required “evidence” factors (see prior slide) will inform the summative decision at the end of the year. • Other administrators will discuss how comparable information will inform the evaluation process. • By February 1st, mid-year evaluation conference • Supervisors provide verbal and written feedback, including relevant information on teacher evaluation, student and parent feedback, attendance rates, and school improvement. • Supervisors also provide clear information on any areas of concern that should be addressed by June.

  44. Evaluation Chronology continued • Summative evaluation meeting at the end of the school year. Administrators provided with: • Verbal and written feedback • “Final” evaluation rating using both the selected rubric and the information provided on teacher evaluations, school improvement progress, attendance, and student, teacher, and parent feedback. • Note: If state-run VAM estimates are a component of the evaluation, the final evaluation rating is unlikely before August, since those estimates are not released earlier. MCEE recommends that, if personnel decisions must be made before state-run VAM data are released, supervisors use the evaluation data that are available.

  45. Administrator Ratings • Professional (see slide 13) • Same definition as for teachers with a slight variation. A “professional” administrator “exhibits the knowledge and capabilities expected of a skillful leader.” • Provisional (see slide 14) • Same definition as for teachers • Ineffective (see slide 15) • Same definition as for teachers

  46. Calculating Administrator Ratings

  47. MCEE Recommendations re: Combining Scores (R-27) • Weight of measures: • During the 2013-2014 and 2014-2015 school years, the MCEE recommends that LEAs use student growth as a significant component, but not more than 50%, of an individual administrator’s evaluation. • To prepare for full implementation, LEAs should pilot the use of SGATs as 50% of an administrator’s evaluation. • As with teachers’ evaluations, final judgments will be determined by combining student growth scores and practice scores.

  48. MCEE Recommendations re: MDE/LEA Responsibilities (R-25-26)

  49. Part Six Waiver Process
