
Educator Evaluation Reform in New Jersey: Overview and Update


Presentation Transcript


  1. Educator Evaluation Reform in New Jersey: Overview and Update
  Camden County Superintendents’ Roundtable, June 8, 2012, Voorhees Township

  2. Evaluation System Reform: Origin and Trends
  NATIONALLY
  • Abundance of research cites teacher effectiveness as the most important in-school factor for improving student achievement
  • The Widget Effect exposes the failure of schools to distinguish among and recognize the effectiveness of their teachers:
    • Nearly all teachers are rated good or great
    • Excellence goes unrecognized
    • Inadequate professional development
    • No special attention to novices
    • Poor performance goes unaddressed
  • The Obama administration highlights evaluation reform as a key commitment tied to federal policy and funding opportunities
  • At least 32 states have recently changed their evaluation systems
  NEW JERSEY
  • Troubling achievement gaps
  • Current evaluations are subjective and fail to impact teaching practice
  • 50% of community college students never graduate

  3. Summary of Lessons Learned from Cohort 1 Pilots
  • Stakeholder engagement is critical for ensuring buy-in during the initial implementation phase
  • District Evaluation Pilot Advisory Committee (DEPAC) meetings that are open to additional staff members help build a culture of trust, transparency, and two-way communication
  • Selection and procurement of a teaching practice observation instrument requires buy-in from stakeholders and a process taking 4-8 weeks
  • Quality observer and teacher training is critical to help ensure teacher understanding, the quality of observer feedback, and the reliability and accuracy of observer judgments
  • Capacity challenges exist for administrators in completing the increased number of observations, so these must be prioritized
  • Identifying and/or developing measures of student achievement for teachers of Non-Tested Grades and Subjects presents a significant challenge

  4. Plan for 2012-2013 School Year, Option 1
  Based on lessons learned from pilot districts, EPAC, and national research and models, DOE will offer two options to districts in 2012-13.
  Option 1: Participate in a new pilot of the evaluation system
  • Funding available for up to 20 districts
  • New pilot districts to help DOE continue to refine plans for a strong statewide system
  • Notice of Grant Opportunity posted March 28, 2012 with full details and application process; applications now under review
  • Current pilots may participate for an extended year through a separate funding process

  5. Plan for 2012-2013 School Year, Option 2
  Option 2: Take steps to prepare for full implementation in 2013-14
  • Pilot the new system in some or all schools, if desired
  • At a minimum, meet the following milestones in 2012-13:
    • Form District Advisory Committee to ensure stakeholder engagement by November 2012
    • Adopt a research-based observation framework and rubric with at least four differentiated levels of performance by January 2013
    • Test frameworks/rubrics from January-August 2013
    • Thoroughly train teachers by June 2013
    • Thoroughly train observers by August 2013
    • Complete progress reports on milestones in January and July 2013

  6. Initial Considerations for Districts: state-wide and budget implications

  7. Funding Sources for Evaluation Work
  • Grant for selected Cohort 2 Pilot Districts
  • General fund unreserved surplus in the original 2012-13 budget
  • Title IIA funds
    • See EE4NJ FAQ website for guidance
  • Race to the Top funds (for those participating and designating evaluation as an activity in their Scope of Work)
  • Title I
    • Title I, Part A funds may be used only for schools with school-wide Title I programs
    • Title I, Part A funds must supplement and not supplant state/local funds; if any components of the district’s teacher evaluation system become a state and/or local requirement, the district may no longer use Title I funds to support the system
  • School Improvement Funds (for those eligible)
    • SIA Part A
      • FY 2011 carryover funds used for this purpose must be encumbered prior to August 31, 2012
    • School Improvement Grant (SIG) funds

  8. Selecting the Teaching Practice Observation Instrument
  • NJDOE is developing a process to approve observation instruments; districts and stakeholders will be able to submit instruments for review
  • There is NO one state-mandated model
  • The only difference between pilot and non-pilot district instrument selection in 2012-13 is that non-pilot districts may use a “home-grown” instrument
  • Creating or modifying an instrument intended to inform high-stakes decisions requires technical expertise and significant resources to develop an evidence base
  • Districts that adopt a home-grown or modified instrument must develop their evidence base over the course of the first year of implementation
  • NJDOE is developing a reporting and auditing process to ensure districts’ selected instruments meet all specifications

  9. Specifications for Evaluation Instrument (by January 2013 for non-pilots)
  The instrument must have an evidence base documenting that it meets the following specifications and practices:
  • Produces scores or classifications of practice that have been shown (in practice or research studies) to differentiate a range of teaching performance
  • Has objective validity evidence on construct and concurrent validity
  • Aligns to 2011 InTASC Model Core Teaching Standards and provides rubrics for assessing teaching practice in at least the following domains of professional practice:
    • learning environment
    • planning and preparation
    • instructional practice/classroom strategies and behaviors
    • professional responsibilities and collegiality, including collaborative practice and ethical professional behavior
  • Provides scales or dimensions that capture multiple and varied aspects of teaching performance
  • Includes rubrics for assessing teaching practice that have a minimum of four levels of performance scores or classifications
  • Is implemented using classroom observations as a major component

  10. Specifications for Evaluation Instrument (cont.)
  • Has resources, which may be provided by an entity other than the instrument developer, that:
    • Provide applied examples of teaching performance across a wide range of skills and performance levels to be used for training teachers and observers and for observer certification or proof of mastery
    • Provide at least one skills assessment sufficient to determine that an observer is scoring at acceptably high levels of accuracy and consistency as compared to expert judgment, to allow certification or proof of mastery for observers applying the instrument (a minimal agreement check of this kind is sketched after this list)
      • Certification or proof of mastery designation would be conferred on candidates who have successfully completed training and achieved a high level of accuracy as defined for that instrument and rubric
    • Permit calibration of observers’ application of the instrument at least once per year
    • Provide ongoing support to teachers and observers, such as exemplar videos of teaching practice measured by the instrument
    • Support the district in building observer capacity, such as train-the-trainer modules and video banks of teaching practice exemplars
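The slides require observers to score "at acceptably high levels of accuracy and consistency as compared to expert judgment" but do not prescribe a statistic or a passing standard. The Python sketch below is only an illustration of how such a skills assessment might be scored against expert master ratings on a four-level rubric; the function names and the 80% exact / 95% adjacent thresholds are assumptions for demonstration, not NJDOE requirements.

    # Hypothetical sketch: compare an observer's rubric scores to expert "master" scores.
    # The 1-4 scale mirrors the four performance levels required above; the 80% exact and
    # 95% within-one-level thresholds are illustrative assumptions, not NJDOE policy.

    def agreement_rates(observer_scores, expert_scores):
        """Return (exact, adjacent) agreement rates between two lists of 1-4 rubric scores."""
        if len(observer_scores) != len(expert_scores) or not observer_scores:
            raise ValueError("Score lists must be non-empty and the same length")
        pairs = list(zip(observer_scores, expert_scores))
        exact = sum(o == e for o, e in pairs) / len(pairs)
        adjacent = sum(abs(o - e) <= 1 for o, e in pairs) / len(pairs)
        return exact, adjacent

    def passes_certification(observer_scores, expert_scores,
                             exact_threshold=0.80, adjacent_threshold=0.95):
        """Illustrative pass/fail decision for an observer skills assessment."""
        exact, adjacent = agreement_rates(observer_scores, expert_scores)
        return exact >= exact_threshold and adjacent >= adjacent_threshold

    # Example: an observer scores ten pre-scored video segments of teaching practice.
    observer = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
    expert   = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
    print(agreement_rates(observer, expert))       # (0.8, 1.0)
    print(passes_certification(observer, expert))  # True

The same check, run periodically on commonly scored lessons or videos, is one way a district could satisfy the once-per-year calibration item above.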

  11. Non-Pilot Participants: Forming an Advisory Committee
  • Goal: Ensure stakeholder engagement in evaluation reform
  • Requirements:
    • Convene committee no later than November 2012
      • NJDOE strongly advises the committee be formed ASAP, as it will oversee and guide planning and implementation of the district’s evaluation policies and procedures
    • Ensure committee is composed of:
      • Teachers from each school level in the district
      • Central office administrators overseeing the evaluation process
      • Administrators conducting evaluations
      • Superintendent
      • Special education administrator
      • Parent
      • Member of the district board of education
      • Additional members at the discretion of the superintendent

  12. Comparison of 2011-12 and 2012-13 Teacher Evaluation Pilots
  Cohort 1 (2011-12)
  • Funding: $1.2M
  • Participants: 11 districts and 19 SIG schools
  • Observations:
    • Informal observations required
    • No unannounced observations required
    • Set duration and number
    • No differentiation between minimum number for teachers of core and non-core subjects
    • No requirements related to inter-rater agreement, use of external observers, or double-scoring
  Cohort 2 (2012-13)
  • Funding: $2.2M
  • Participants: Approximately 20 new districts and Cohort 1 districts that opt to continue
  • Observations:
    • Some unannounced observations required
    • More flexibility on duration and number
    • Minimum number differs for teachers of core and non-core subjects
    • New processes required to ensure inter-rater agreement and accuracy, including use of external observers and double-scoring of some sessions
  Common to Cohort 1 and Cohort 2
  • External researcher engaged
  • Stakeholder advisory committees required at state and district level
  • Communication plan and collaboration with NJDOE
  • Aligned professional development plan
  • Comprehensive training for evaluators and teachers
  • Commitment to develop and test measures of student performance

  13. Observation Requirement Changes: 2011-12 to 2012-13

  14. 2012-13 Teacher Evaluation Pilot Observation Requirements

  15. 2012-13 Teacher Evaluation Pilot Components

  16. 2012-13 Pilot Requirements for Evaluation System
  • Annual teacher evaluations based on standards of effective teaching practices; every teacher, regardless of experience, deserves meaningful feedback on teaching performance on an annual basis
  • Multiple measures of teaching performance and student performance, with student academic progress or growth as a key measure
  • A summative rating that combines the scores of all the measures of teaching practice and student achievement (one possible combination is sketched after this list)
  • Four summative rating categories (highly effective, effective, partially effective, ineffective) that clearly differentiate levels of performance
  • A link from the evaluation to professional development that meets the needs of educators at all levels of practice
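The pilot requires a summative rating that combines practice and student-achievement measures into the four categories, but the slides do not specify component weights or cut scores. The sketch below is therefore only a hypothetical illustration: the 50/50 weighting, the 1.0-4.0 component scales, and the category cutoffs are assumptions, not the pilot's actual combination rules.

    # Hypothetical illustration of a summative rating: the weights, scales, and cut scores
    # here are assumptions for demonstration only, not the NJ pilot's actual formula.

    def summative_rating(practice_score, student_growth_score,
                         practice_weight=0.5, growth_weight=0.5):
        """Combine two 1.0-4.0 component scores into one of four rating categories."""
        combined = practice_weight * practice_score + growth_weight * student_growth_score
        # Assumed cut scores on the combined 1.0-4.0 scale.
        if combined >= 3.5:
            return "Highly Effective"
        elif combined >= 2.65:
            return "Effective"
        elif combined >= 1.85:
            return "Partially Effective"
        return "Ineffective"

    # Example: strong observed practice, moderate student growth.
    print(summative_rating(practice_score=3.4, student_growth_score=2.8))  # Effective

Whatever weights a district ultimately uses, the four output categories and the use of student academic progress as a key measure are the fixed requirements stated above.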

  17. EE4NJ Website and Contact Information
  Website: http://www.state.nj.us/education/EE4NJ/
  Contact information:
  • For general questions, please email ee4nj@doe.state.nj.us
  • Phone: 609-341-3306
