  1. Teacher and Principal Evaluation Project Update MAG 2013 November 21, 2013 Dave Volrath, MSDE TPE Action Team Lead Ben Feldman, TPE Action Team

  2. Agenda • Maryland State TPE Teacher Model • Learnings from the 2012-13 Field Test • Getting the work done • Moving the money to the LEAs • Spheres of Influence • How will we know? • Sense of confidence from the field • Independent partnerships • Toward a consistent data collection • Waivers, new assessments

  3. State Teacher Evaluation Model
Professional Practice: 50% Qualitative Measures (domain percentages proposed by the LEA and approved by MSDE)
• Planning and Preparation: 12.5%
• Instruction: 12.5%
• Classroom Environment: 12.5%
• Professional Responsibilities: 12.5%
Student Growth: 50% Quantitative Measures, as defined below
• Elementary/Middle School Teacher, One Tested Area: 20% MSA Lag Measure based on either Math or Reading; 15% Annual SLO Measure as determined by priority identification at the district or school level; 15% Annual SLO Measure as determined by priority identification at the classroom level
• Elementary/Middle School Teacher, Two Tested Areas: 20% MSA Lag Measure based on 10% Reading and 10% Math; 15% Annual SLO Measure (district or school level); 15% Annual SLO Measure (classroom level)
• High School Teacher, Tested Subjects: 20% SLO Lag Measure based on HSA Algebra, HSA English 2, HSA Biology, or HSA American Government, including an HSA data point; 15% Annual SLO Measure (district or school level); 15% Annual SLO Measure (classroom level)
• K-12 Non-Tested Area/Subject Teachers: 20% SLO Lag Measure based on School Progress Index indicators (Achievement, Gap Reduction, Growth, College and Career Readiness), Advanced Placement tests, or similarly available measures; 15% SLO Measure (district or school level); 15% Annual SLO Measure (classroom level)
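As a minimal sketch (not MSDE's implementation) of how these weights combine, consider an elementary/middle school teacher with one tested area. The weights below come from the slide; the 0-100 scale and the sample component scores are invented for illustration.

```python
# Hypothetical illustration of the State Model weights for an
# elementary/middle school teacher with one tested area.
# Weights are from the slide; sample scores are invented.

weights = {
    "planning_and_preparation": 0.125,
    "instruction": 0.125,
    "classroom_environment": 0.125,
    "professional_responsibilities": 0.125,
    "msa_lag_measure": 0.20,          # math or reading
    "slo_district_or_school": 0.15,
    "slo_classroom": 0.15,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # 50% practice + 50% growth

scores = {  # invented sample component scores on a 0-100 scale
    "planning_and_preparation": 84,
    "instruction": 90,
    "classroom_environment": 88,
    "professional_responsibilities": 90,
    "msa_lag_measure": 75,
    "slo_district_or_school": 80,
    "slo_classroom": 70,
}

composite = sum(weights[k] * scores[k] for k in weights)
print(f"Composite rating: {composite:.1f} / 100")  # 81.5
```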

  4. Discarded MSA Conversion Methods: 2010 RTTT Grant, MAG 2011, MAG 2012

  5. Maryland Tiered Achievement Index • M-TAI 2 is informed by extensive proof-of-concept testing in multiple LEAs. Ten cells change. • Increasing the premium blue area was recommended by USDE (6 cells). • Because the MSA has scant discriminatory power above the advanced cut, holding an advanced student within that performance band is treated as a year’s growth (1 cell). • By extension, holding a student within proficient by one performance level is also construed as a year’s growth and is not penalized (2 cells). • Cell A1P3 is a “freak” cell that fluoresces occasionally, e.g., in grade 6 reading. On those rare occasions, as many as 6,000 students land in that cell. It reflects a phenomenon of the test, not of the students or teachers, and deserves mitigation (1 cell). • The basic innovation is a widening of the status diagonal, as the sketch below illustrates.
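A rough sketch of the mechanics: a tiered index can be read as a lookup over nine prior-year and nine current-year performance bands, with the widened diagonal crediting the holds described above as a year's growth. The band labels match the deck's 9x9 framing; the index values are invented placeholders, not the real M-TAI 2 cells.

```python
# A minimal sketch of an M-TAI-style lookup. Band labels (B=Basic,
# P=Proficient, A=Advanced, three levels each) follow the deck's 9x9
# framing; the numeric credits below are INVENTED placeholders.

BANDS = ["B1", "B2", "B3", "P1", "P2", "P3", "A1", "A2", "A3"]

def mtai_credit(prior: str, current: str) -> float:
    """Growth-index value for one student's year-over-year move."""
    i, j = BANDS.index(prior), BANDS.index(current)
    delta = j - i
    # Widened status diagonal: holding within Advanced counts as a
    # year's growth (2.0) because the MSA discriminates poorly there.
    if prior.startswith("A") and current.startswith("A"):
        return 2.0 if delta <= 0 else 2.0 + delta
    # Slipping one level while staying within Proficient is not penalized.
    if prior.startswith("P") and current.startswith("P") and delta == -1:
        return 2.0
    return 2.0 + delta  # placeholder: one cell of credit per band moved

# Teacher score = mean credit over the attributed roster (invented sample).
roster = [("P2", "P3"), ("A1", "A1"), ("B3", "P1"), ("P3", "P2")]
score = sum(mtai_credit(p, c) for p, c in roster) / len(roster)
print(f"Teacher M-TAI average: {score:.2f}")  # 2.50 for this sample
```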

  6. Borrowing Calvert County’s approach to using the standard deviation to interpret performance. Performance within a one-standard-deviation band spanning the grade mean is considered expected and acceptable (green bracket). Growth more than 0.5 STD above the mean is beyond expected and commendable (blue bracket). Performance 0.5 STD below the central range is concerning (yellow bracket); performance a full STD below the mean is a significant loss and unacceptable (red bracket). Slide borrowed from CCPS presentation, March 11, 2013.
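A minimal sketch of these brackets as a classification rule, assuming the green band is the one-STD-wide range centered on the grade mean; the function name and the return labels are our own framing, not CCPS's.

```python
# Sketch of the Calvert County-style standard-deviation brackets.
# Bracket boundaries follow the slide; labels are our own wording.

def tier(teacher_mean: float, grade_mean: float, std: float) -> str:
    """Classify a teacher's average growth against the grade distribution."""
    z = (teacher_mean - grade_mean) / std
    if z > 0.5:
        return "blue: beyond expected, commendable"
    if z >= -0.5:
        return "green: expected and acceptable"  # one-STD band around the mean
    if z >= -1.0:
        return "yellow: concerning"
    return "red: significant loss, unacceptable"

print(tier(2.3, 2.0, 0.4))  # z = 0.75 -> blue bracket
```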

  7. M-TAI Means, STDs, and Tiers Using Spring 2013 MSA Data The model generally centers around 2.0, a year’s acceptable growth. The MSA swings, e.g., from math grade 4 to 5 or reading grade 5 to 6, are evident. A goal of M-TAI 2 was to be sensitive to this and to mitigate it.

  8. Statistical considerations • The means and standard deviations are derived from the statewide distribution of students. • The performance tiers are fit to teacher averages. • A purist model would divide the STD by the square root of the class n… • But that would overly constrain the data and force many teachers out of the effective range.
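For reference, the "purist" adjustment is the standard error of the mean, SE = STD / sqrt(n). With the invented numbers below, the sketch shows how sharply that narrows the acceptable band:

```python
import math

# The "purist" approach: judge a class mean against the standard error
# of the mean rather than the student-level STD. Numbers are invented.

std = 0.9          # assumed student-level STD of the growth index
n = 25             # assumed typical class size
se = std / math.sqrt(n)

print(f"student STD = {std:.2f}, standard error of class mean = {se:.2f}")
# A +/-0.5 STD green band (half-width 0.45) would shrink to +/-0.5 SE
# (half-width 0.09), pushing many class means outside "effective".
```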

  9. Shared Measures Discarded from the State Teacher Model: 20% MSA / 30% SLO vs. 10% MSA / 10% SPI / 30% SLO. Top 2 and bottom 2 panels sorted by Delta (n = 182 total).

  10. Summary of Changes and a New Approach to Doing the Work • The State Model has been SIMPLIFIED and reflects best practices gleaned from LEA models. • Shared measures are an imperfect fit for teachers. • Lagged data does not have to be a fatal flaw. • Reconceiving evaluation as a continuous cycle changes everything.

  11. Translating MSA to a %. Lagged data are part of a continuous evaluation cycle, not a liability. Cycle stages from the slide graphic: Extracting Teacher & Principal Measures → Setting SLOs → Conducting Pre-Conferences → Writing the SIP → Data Analysis → Observing Professional Practice → Aligning SIP Goals → Conference → Reviewing Annual Data → Evaluation → Professional Practice.

  12. The 2012-13 All LEA Field Test • All 22 RTTT LEAs participated. • The sample included 20% of all Maryland principals and 18% of all Maryland teachers. • Four LEAs tested the pure State model and provided detailed data for analysis and modeling. • The following results reflect all 22 LEAs and details from the 3 + 1 piloting LEAs.

  13. Distribution of May 2013 Submissions: Total State Teachers, N = 8,047

  14. Distribution of May 2013 Submissions: Total State Principals, N = 243

  15. Descriptive Statistics from Four LEAs Providing Detailed TPE Teacher Data

  16. Descriptive Statistics from Four LEAs Providing Detailed TPE Teacher Data

  17. Caveats and Observations • LEAs tended to use go-to schools and go-to staff. • We have a better understanding of the top of the scale. • SLO scores were sometimes incomplete and were imputed or treated as a default of 67%. • About 18% of all teachers appear consistently highly effective. • Preliminary scores suggest that 78-82 basis points is a sound place for the effective/highly effective cut. • NO teacher would miss highly effective based on a State Assessment. • This also implies we know little about the bottom of the scale, a next Big Topic.
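A small sketch of how the preliminary cut might be applied on the 100-basis-point scale. The 78-82 range and the 67% SLO default come from the bullets above; the specific cut of 80 and the sample composites are invented.

```python
# Sketch of applying the preliminary effective/highly-effective cut.
# 80 is an assumed value inside the deck's suggested 78-82 range.

HIGHLY_EFFECTIVE_CUT = 80   # assumption within the 78-82 range above
SLO_DEFAULT = 67            # default used when SLO scores were incomplete

def rating(composite: float) -> str:
    """Classify a 0-100 composite against the preliminary cut."""
    if composite >= HIGHLY_EFFECTIVE_CUT:
        return "highly effective"
    return "effective or below"

for score in (81.5, 76.0):  # invented composites
    print(score, "->", rating(score))
```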

  18. The Approval Process for LEA Models • LEAs submitted preliminary plans in December 2012. • As the State Model was “hammered out,” LEAs gained confidence in the collaborative work. • LEAs submitted qualifying plans, endorsed by local bargaining units, in June 2013. • All 22 RTTT LEAs have APPROVED plans.

  19. How do the LEA plans match or differ? • LEA plans are more LIKE the State Model than dissimilar. • SLOs are universally embraced. • M-TAI is broadly accepted. • There is occasional variation in setting the weights on the Professional Practice side. • A variety of scales is in play: 100 basis points, 4.0 GPA, 10 basis points, unrelated scales. • Most LEAs have dropped shared measures; a few still use local School Wide Indices. • There is localized interest in student surveys.

  20. Moving the Funds to the LEAs • No LEAs wanted a centralized State system when their own systems were maturing. • Grants were awarded by a formula that included inverse size, wealth, burden, and academic challenge, as sketched below. • Four categories of assurances were required. • Over $1.5M was released to LEAs.
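The deck does not give the formula's weights, so the following is a purely hypothetical illustration of a needs-based allocation over the four named factors, with invented LEA values and equal weighting.

```python
# Purely hypothetical formula-allocation sketch. The four factors are
# named on the slide; the weights, normalization, and LEA values are
# all assumptions made for illustration only.

POOL = 1_500_000  # "over $1.5M was released to LEAs"

leas = {  # invented factor scores, each already normalized to 0-1
    "LEA A": {"inverse_size": 0.8, "wealth": 0.3, "burden": 0.6, "academic_challenge": 0.7},
    "LEA B": {"inverse_size": 0.2, "wealth": 0.7, "burden": 0.4, "academic_challenge": 0.5},
}

def need(f: dict) -> float:
    # Equal-weight composite; wealthier LEAs receive less, so wealth is inverted.
    return f["inverse_size"] + (1 - f["wealth"]) + f["burden"] + f["academic_challenge"]

total = sum(need(f) for f in leas.values())
for name, f in leas.items():
    print(f"{name}: ${POOL * need(f) / total:,.0f}")
```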

  21. Four Assurance Categories • Quality Controls: Does the LEA have a quality model it can implement? • Professional Development: Are the LEA, its executive officers, principals, teachers, and communication officers READY? • School-based evaluator needs. • District-level needs. LEAs had to satisfy category 1, then category 2. If 1 and 2 were satisfied, 3 and 4 were discretionary.

  22. Distribution of 90% of Grant Awards: Areas Selected by Multiple LEAs

  23. How will we know? • Assurances • Closing the Information Loop • Grant Linkages…Implementation and Sustainability • Proactive Outreach to LEAs • Validation • Quality Control Group • Collecting evidence and artifacts • QER / Reform Support Network • MAAC@WestEd…Critical Friend • Reaching Principals and Teachers

  24. How will we know? • Organized around Levels of Confidence • Focus on achievement of Spheres 1 and 2, and readiness to undertake Sphere 3 • New interactive approaches using Webinars and real-time polling. • New format: mini eConferences • Gathering consistent data, narratives, and artifacts

  25. “Spheres of Influence”

  26. Sphere Design: Sphere One Calendar • June 12: LEA PD Coordinators • July 9-10: Executive Officers Summit • Aug. 7: PSSAM Executive Board • Aug. 15: Assistant Principals, MASSP • Aug. 19: Communication Bulletin #19 • NA: Assistant Superintendents • Aug. 23: Superintendents • Aug. 29: Quality Control Session • Sept. 2: Communication #20 • July 11-Sept. 19: LEA Direct Assistance

  27. How will you know? Quality Control by Design (Sphere One calendar: June 12-Sept. 19)
USDE Assurances (24 in all); for example:
• The LEA has a process for attributing students to the teacher(s) of record, including affording each teacher an opportunity to review and confirm the roster.
• The LEA has a strategy and the resources to execute its TPE communication plan.
Evidence of Quality:
• Sample LEA artifact provided to principals showing MSA scores translated to a % for teachers in a school
• Sample school artifacts of internal TPE school communications
Sphere One Outcome: By the end of Sphere 1, leadership personnel should be able to conduct beginning-of-the-year pre-evaluation conferences that include reporting the teacher’s or principal’s MSA translation scores and setting teacher or principal SLOs, demonstrate a basic understanding of how to construct three-year cohorts, and plan the evaluation workload for the 2013-2014 school year.

  28. Quality Control Session • Completes each Sphere. • The panel includes a lead person from each LEA and from MSEA, MAESP, MASSP, and PSSAM. • Provides critical feedback on whether the goals of the Sphere have been achieved. • Sessions are part of the critical body of evidence answering the “How do you know?” question.

  29. Distribution of Quality Assurance Session October 30, 2013 Polling Responses

  30. N/R = No Response provided

  31. A QCS Outcome: LEA SLO Webinars
Stark and Griffin, Wicomico: Web-based SLO Management System. The Wicomico Evaluation Portal houses a web-based SLO management system. Linked to the district’s student data management system, teachers develop SLOs using a template, select students from drop-down class lists, set target attainment values, and upload associated documentation, among other functions. Using built-in workflow functionality, principals access the system to review the SLO, offer comments and suggestions, and ultimately approve the SLO. Central Office staff, including content supervisors, may access SLOs to offer advice to teachers and principals during the SLO development and approval process and to monitor the rigor of SLOs across schools. Eventually the Evaluation Portal will house the results of other components of the teacher evaluation, including the professional practice scores and assessment results.
Alisaukas, Carroll: Managing the Workload of SLOs. During this webinar, Carroll County Public Schools will describe their system for managing the logistics that come along with SLOs. Specifically, the webinar will focus on our user-friendly electronic SLO storage, review, and approval process. Embedded in our process is a communication feature that allows principals to get feedback from content supervisors without cumbersome communications: no e-mail attachments, no faxes. Administrators across the system log into our application to review any SLO at any time. As student data management is a major component of SLOs, and of the Student Growth Component in general, time will also be spent on our application designed to allow teachers and administrators to easily access and analyze data.
Lawson, Cecil: An Online Approach to Managing Instructional Leadership through the Teacher Evaluation Instrument. Cecil County Public Schools has contracted with a non-profit data services agency in Delaware (The Data Service Center), which hosts the school system’s evaluation process. This includes all SLO management, teacher observation processes, rubrics, forms, and the evaluation process. This model, currently in its first year of implementation, is performing beyond expectations. In particular, the process helps our system organize SLOs in such a way that principals, coordinators, system leaders, and teachers can see SLO selection, progress, and the ensuing data that supports the SLO process.
Kubic, Anne Arundel: SLO Templates and Data Tables. Navigating the mathematics necessary to complete an SLO is an unnecessary barrier to writing a quality SLO. AACPS created six Target Templates with corresponding Data Tables to help all teachers complete the mathematics required in SLO Targets. With so many moving parts in our TPE models, AACPS created an Administrative Management Tool that assists school leaders in tracking SLOs, scheduling conferences for both SLOs and observations, and scheduling the rating. Attend this mini-webinar to learn how this tool works to support schools in effectively using their “TPE Time.”

  32. Mini Webinar Links (Don’t miss the Charles County presentation today at MAG)
Wicomico: Web-based SLO Management System https://www2.gotomeeting.com/register/731525618
Carroll: Managing the SLO Workload https://www2.gotomeeting.com/register/610097426
Cecil: An Online Approach to Managing Instructional Leadership through the Teacher Evaluation Instrument https://www2.gotomeeting.com/register/510615666
Anne Arundel: SLO Templates and Data Tables https://www2.gotomeeting.com/register/121260274
MSDE: Sphere of Influence Technical Assistance: Focus on SLO Management https://www2.gotomeeting.com/register/209224442

  33. West Ed Updates: Survey Findings • There is a substantial learning curve involved with SLOs, and there are concerns regarding the SLO-related support teachers receive. • Field tests, meaningful information, and union/management collaboration make a difference; there are concerns about communication and community support. • There are concerns about the capacity of principals to serve as effective evaluators, and questions regarding classroom observations and technology. • Many districts are inadequately prepared to make use of final evaluation results.

  34. West Ed Updates: Recommendations • Broaden capacity building • Provide specialized support to principals • Help districts translate evaluation ratings into improved practices • Develop a rapid response capability

  35. Update on the Quality Evaluation Rollout • Charles County was the LEA representative at the October 11 convening. • Dashboard and Scorecards are QER’s present interest. • Here’s the MSDE Page where we posted: http://marylandpublicschools.org/MSDE/programs/tpe/co.html • Direct links to their materials on our page: • Presentation • Examples

  36. Looking ahead to more valuable information: LEA dashboards, vendor roles, consistent teacher data collections, and a State Scorecard

  37. Principal Data

  38. Questions Required by USDE on the APR Report: Planning Now • N and % of teachers and principals with qualifying evaluation systems who were eligible for tenure during the prior year • N and % of teachers and principals with qualifying evaluation systems who were • Granted tenure or full certification • Retained or terminated • Subject to compensation decisions

  39. Questions Required by USDE on the APR Report, Continued • N and % of schools that were: • High poverty, high minority, or both • Low poverty, low minority, or both • N and % of teachers and principals who were • Highly effective; effective or higher; or ineffective • Evaluated in schools meeting the above criteria • Breakouts for mathematics, science, ESOL/ELL, and Special Education teachers

  40. Waivers • From Education Week: Education Department to Scale Back Key Waiver–Renewal Mandates • What’s in play: • MSA-related measures set aside for personnel decisions • Double testing • Continuing the ESEA waiver and its implications • What’s still expected: run the approved model and submit the complete rating with its components.

  41. Thinking Ahead to PARCC • PARCC will return 5 levels instead of 3. • The 9×9 M-TAI matrix can be revisited as a 9×5 using the insights gained this year: an empirical model in lieu of a theoretical approach. • 2014-15 will be a year of intensive learning for everyone.
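A placeholder sketch of the reshaped lookup: the prior-year axis keeps the nine MSA bands while the current-year axis becomes PARCC's five levels. The slide implies only the structure; every cell value would be set empirically, so none are filled in here.

```python
# Placeholder for a 9x5 M-TAI-style matrix: 9 prior-year MSA bands by
# 5 PARCC performance levels. Cell values are deliberately left unset;
# the deck proposes filling them empirically rather than theoretically.

MSA_BANDS = ["B1", "B2", "B3", "P1", "P2", "P3", "A1", "A2", "A3"]  # prior year
PARCC_LEVELS = [1, 2, 3, 4, 5]                                      # current year

mtai_parcc = {(band, level): None for band in MSA_BANDS for level in PARCC_LEVELS}
print(len(mtai_parcc))  # 45 cells to calibrate, down from 81
```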

  42. DRAFT: FOR DISCUSSION ONLY. The Maryland Teacher and Principal Evaluation Guidebook, August 2013, Version 3. Old version: 207 pages; new version: 20 pages.

  43. TPE Contacts Dave Volrath, Lead dvolrath@msde.state.md.us, 410 767 0504 Ben Feldman, TA Strand befeldman@msde.state.md.us, 410 767 0142
