
Marshfield Public Schools District Determined Measures


Presentation Transcript


  1. Marshfield Public Schools District Determined Measures Dr. Deborah A. Brady, Ribas Associates, Inc.

  2. Do Now*
  • Please create a name tag or a “name tent” with your first name and school or department.
  • Read the Table of Contents on page 1.
  • Respond to the Do Now on page 2 of your handout.
  *The materials are online if you want to follow along and add notes: http://tinyurl.com/l7287z9

  3. The SCOPE of the Work: PLANNING

  4. New Timetable (changes in handout on page 3)

  5. DESE is still rolling out the evaluation process and District Determined Measures.

  6. NEW DESE Support for Teacher Evaluation and Alignment to the Common Core
  • Sample DDMs in the five required pilot areas
  • Technical Assistance and Networking sessions on September 19th
  • Technical Guide B (in this PowerPoint), which addresses the practical application of assessment concepts to piloting potential DDMs and measuring student growth
  • Model collective bargaining language
  • An ongoing Assessment Literacy webinar series
  • Guidance on constructing local growth scores and growth models (to be released)
  • Guidance on determining Student Impact Ratings (to be released)

  7. Support from DESE
  • Additional Model Curriculum Units, which include curriculum-embedded performance assessments (CEPAs)
  • Guidance on the use of CEPAs as part of a DDM strategy
  • Professional development for evaluators on how to focus on shifts embedded in the new ELA and math Curriculum Frameworks during classroom observations
  • Professional development for evaluators on how to administer and score DDMs and use them to determine high, moderate, or low growth, focused on the five required DDM pilot areas
  • A Curriculum Summit in November

  8. DDM Impact 2014
  • Take advantage of a no-stakes pilot year to try out new measures and introduce educators to this new dimension of the evaluation framework.
  • Districts are strongly encouraged to expand their pilots beyond the five required pilot areas.
  • Fold assessment literacy into the district's professional development plan to stimulate dialogue amongst educators about the comparative benefits of different potential DDMs the district could pilot.
  • Consider how contributing to the development or piloting of potential DDMs can be folded into educators' professional practice goals.

  9. DDM Impact 2014 From the Commissioner: “Finally, let common sense prevail when considering the scope of your pilots. I recommend that, to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught. There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”

  10. SY2014 Pilot Year
  SEPTEMBER: Provide DESE a tentative plan for the required pilot areas:
  • Early grade literacy (K-3)
  • Early grade math (K-3)
  • Middle grade math (5-8)
  • High school “writing to text” (PARCC multiple texts)
  • PLUS one more non-tested course, for example: Fine Arts, Music, PE/Health, Technology, Media/Library, or other non-MCAS growth courses, including grade 10 Math and ELA, and Science
  DECEMBER: Implementation Extension Request Form for specific courses in the June plan
  BY JUNE: A plan for all other DDMs must be ready for implementation in year 2 (SY2015): at least one “local” (non-MCAS) measure and two measures per educator. The scores will not count for those who pilot DDMs in 2014.

  11. SY2015
  All professional personnel will be assessed with two DDMs, at least one of which will be locally determined:
  • All teachers
  • Guidance
  • Principals, assistant principals
  • Speech therapists
  • School psychologists
  • Nurses
  EXCEPT those waivered by DESE through a case-by-case decision process. The scores will count as the first half of the “impact score,” with the waivered courses as the only exception.

  12. SY2016
  • “Impact Ratings” will be given to all licensed educational personnel and sent to DESE.
  • Two measures for each educator
  • At least one locally determined measure for everyone; some educators will have two locally determined measures
  • The locally determined measure can be a standardized test such as the DRA, MAP, Galileo, etc.
  • The MCAS can be only one of the measures
  • The average of two years’ scores is used
  “Impact Ratings” are based upon two years’ growth scores on two different assessments, at least one of them a locally determined, non-MCAS measure.

  13. Every educator earns two ratings: a Summative Performance Rating (Exemplary, Proficient, Needs Improvement, Unsatisfactory) and an Impact Rating on Student Performance (High, Moderate, Low). *Most districts will not begin issuing Impact Ratings before the 2014-2015 school year. (Massachusetts Department of Elementary and Secondary Education)

  14. Student Impact Rating Determines Plan Duration [Slide graphic: plan duration by Impact Rating on Student Performance] (Massachusetts Department of Elementary and Secondary Education)

  15. Acceptable (Standardized, but Still Considered District Determined) Assessments
  • MCAS can serve as one score (ELA, Math, Science)
  • One or two locally developed assessments; some educators may have three
  • DESE exemplars for the required pilot areas will be available in August 2013
  • The MA Model Units rubrics can be used
  • Galileo
  • BERS-2 (behavioral rating scales)
  • DRA (reading)
  • Fountas and Pinnell Benchmark
  • DIBELS (fluency) ???
  • MCAS-Alt
  • MAP
  • AP

  16. A Variety of Assessment Types
  • On demand (timed and standardized)
  • Mid-year and end-of-year exams
  • Projects
  • Portfolios
  • Capstone courses
  • Unit tests
  • Other
  Formats can include:
  • Multiple choice
  • Constructed response
  • Performance (oral, written, acted out)

  17. What kinds of assessments will work for administrators, guidance, nurses, and school psychologists?
  • Use school-wide growth measures
  • Use MCAS and extend it to all educators in a school
  • Use “indirect measures,” such as dropout rates, attendance, etc.
  • Use Student Learning Objectives (SLOs)? Team-based SLOs?
  • Or create measures.
  A pre- and post-test are required to measure growth.

  18. GROWTH SCORES for Educators Will Need to Be Tabulated for All Locally Developed Assessments [Slide graphic: sample MCAS scaled scores paired with student growth percentiles, e.g., 244/25 SGP, 230/35 SGP, 225/92 SGP]
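MCAS reports growth as a student growth percentile (SGP) on a 1-99 scale; for a locally developed assessment, the district has to tabulate an analogous growth score itself. The following Python sketch shows one simple possibility, assuming a pre/post design: rank each student's gain against peers. This is only a rough stand-in, not DESE's actual SGP methodology, which conditions on each student's full score history.

```python
# Minimal sketch (an assumption, not DESE's SGP method): express each
# student's pre-to-post gain as a 1-99 percentile rank among classmates.

def growth_percentiles(pre: list[float], post: list[float]) -> list[int]:
    """Return a 1-99 percentile rank of each student's gain."""
    gains = [b - a for a, b in zip(pre, post)]
    n = len(gains)
    percentiles = []
    for g in gains:
        below = sum(1 for other in gains if other < g)
        pct = round(100 * below / n)
        percentiles.append(min(max(pct, 1), 99))  # clamp to SGP-style 1-99
    return percentiles

# Example: three students' scaled scores on a local assessment.
print(growth_percentiles(pre=[230, 244, 225], post=[240, 248, 252]))
# -> [33, 1, 67]
```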

  19. According to Technical Guide B (summarized on page 31 of the handout), focus on the following:
  • Is the measure aligned to content?
  • Is the measure informative?

  20. Entry point to DDM work: Two Focus Questions
  • Is the measure aligned to content?
  • Does it assess what is most important for students to learn and be able to do?
  • Does it assess what the educators intend to teach?

  21. Entry point to DDM work: Two Focus Questions
  • Is the measure informative?
  • Do the results of the measure inform educators about curriculum, instruction, and practice?
  • Does it provide valuable information to educators about their students?
  • Does it provide valuable information to schools and districts about their educators?

  22. Five Considerations
  • Measure growth
  • Common administration procedure
  • Common scoring process
  • Translate to an Impact Rating (see the sketch below)
  • Comparability
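On the "translate to an Impact Rating" consideration, here is a hypothetical sketch: summarize a roster's growth scores by their median and bucket the result into Low/Moderate/High. The 35/65 cut points mirror a common SGP convention but are illustrative assumptions, not values the slides specify; districts set their own.

```python
from statistics import median

# Hedged sketch: one way to translate a roster of student growth scores
# into a Low/Moderate/High Impact Rating. The 35/65 cut points are
# illustrative assumptions, not mandated values.

def impact_category(growth_percentiles: list[int],
                    low_cut: int = 35, high_cut: int = 65) -> str:
    m = median(growth_percentiles)
    if m < low_cut:
        return "Low"
    if m > high_cut:
        return "High"
    return "Moderate"

print(impact_category([25, 35, 92, 50, 70]))  # -> "Moderate" (median 50)
```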

  23. What is comparability?
  • Comparable within a grade, subject, or course across schools within a district; identical measures are recommended
  • Comparable across grade or subject level district-wide
  • Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor

  24. Measuring Student Growth with DDMs

  25. Approaches to Measuring Student Growth
  • Pre-test/post-test
  • Repeated measures
  • Holistic evaluation
  • Post-test only

  26. Pre/Post Test
  • Description: the same or similar assessments administered at the beginning and at the end of the course or year
  • Example: a grade 10 ELA writing assessment aligned to College and Career Readiness Standards, given at the beginning and end of the year
  • Measuring growth: the difference between pre- and post-test scores (a sketch follows)
  • Considerations: do all students have an equal chance of demonstrating growth?
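A minimal sketch of the pre/post computation and of the "equal chance" consideration: growth is the post-minus-pre difference, and students who start near the test's ceiling are flagged because the instrument leaves them little room to show growth. The maximum score and the 90% threshold are assumptions for illustration.

```python
# Pre/post growth with a ceiling-effect check. max_score and the 90%
# ceiling threshold are illustrative assumptions.

def prepost_gains(pre, post, max_score=100, ceiling=0.9):
    gains = [b - a for a, b in zip(pre, post)]
    flagged = [i for i, a in enumerate(pre) if a >= ceiling * max_score]
    return gains, flagged

gains, near_ceiling = prepost_gains([62, 95, 78], [80, 97, 85])
print(gains)         # [18, 2, 7]
print(near_ceiling)  # [1]: student 1 started near the ceiling
```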

  27. Repeated Measures
  • Description: multiple assessments given throughout the year
  • Examples: running records, attendance, the mile run
  • Measuring growth: graphically, with options ranging from the sophisticated to the simple (one simple option is sketched below)
  • Considerations: less pressure on each administration; authentic tasks
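As one of the "simple" options, a least-squares trend line per student summarizes repeated administrations in a single number. This is an illustration, not a DESE-specified method; for a running record, errors should fall over time, so a negative slope indicates growth.

```python
# Summarize a student's repeated measures with a least-squares slope.

def slope(xs: list[float], ys: list[float]) -> float:
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

weeks = [0, 4, 8, 12, 16]              # weeks into the year
errors = [14, 11, 9, 7, 4]             # running-record errors per administration
print(round(slope(weeks, errors), 2))  # -0.6 errors/week: downward trend, growth
```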

  28. Repeated Measures Example [Chart: running record, number of errors by date of administration]

  29. Holistic
  • Description: assess growth across student work collected throughout the year
  • Example: Tennessee Arts Growth Measure System
  • Measuring growth: a growth rubric (see example)
  • Considerations: an option for multifaceted performance assessments; rating can be challenging and time-consuming

  30. Holistic Example Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts

  31. Post-Test Only
  • Description: a single assessment, or data that is paired with other information
  • Example: an AP exam
  • Measuring growth, where possible: use a baseline, or assume an equal beginning (a baseline-based sketch follows)
  • Considerations: may be the only option for some indirect measures; what is the quality of the baseline information?
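When there is no pre-test, growth has to be inferred against a baseline. One hypothetical approach, an assumption rather than anything the slides prescribe: regress the post-test on a baseline measure across the roster, and treat each student's residual above or below expectation as the growth signal. The prior-year course grade used as the baseline here is itself an assumed proxy.

```python
# Post-test-only growth as a residual against a baseline expectation.
# Baseline = prior-year course grade (assumed proxy); a least-squares
# line gives the expected post-test score, and growth is the residual.

def residual_growth(baseline, post):
    n = len(baseline)
    mx, my = sum(baseline) / n, sum(post) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(baseline, post))
         / sum((x - mx) ** 2 for x in baseline))
    a = my - b * mx
    return [y - (a + b * x) for x, y in zip(baseline, post)]

prior_grades = [70, 80, 90]   # assumed baseline information
ap_scores = [2, 4, 4]         # post-test only (e.g., AP exam, 1-5 scale)
print([round(r, 2) for r in residual_growth(prior_grades, ap_scores)])
# -> [-0.33, 0.67, -0.33]: student 2 outperformed the baseline expectation
```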

  32. Examples
  • Portfolios: measuring achievement vs. growth
  • Unit assessments: looking at growth across a series
  • Capstone projects: may be a very strong measure of achievement

  33. Piloting District Determined Measures

  34. Piloting DDMs
  • Piloting: test, analyze, adjust, repeat
  • Being strategic and deliberate: collaboration, iteration, information

  35. Pilot Steps
  • Prepare to pilot: build your team; identify content to assess; identify the measure (aligned to content, informative); decide how to administer and score
  • Test: administer and score
  • Analyze
  • Adjust

  36. Analyzing Results: Example Focus Questions
  • Is the measure fair to special education students?
  • Are there differences in scores due to rater? (A quick check is sketched below.)
  • Is growth equal across the scale?
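For the rater question, the simplest pilot-analysis check is to compare mean scores by rater; a large gap suggests the scoring process needs calibration before the DDM is used for real. A sketch, with made-up pilot data:

```python
from collections import defaultdict

# Group pilot scores by who scored them and compare means. The records
# below are invented pilot results, for illustration only.

def mean_by_rater(records: list[tuple[str, float]]) -> dict[str, float]:
    by_rater = defaultdict(list)
    for rater, score in records:
        by_rater[rater].append(score)
    return {r: sum(s) / len(s) for r, s in by_rater.items()}

pilot = [("Rater A", 3.0), ("Rater A", 2.5), ("Rater B", 3.5), ("Rater B", 4.0)]
print(mean_by_rater(pilot))
# {'Rater A': 2.75, 'Rater B': 3.75}: a 1-point gap worth calibrating
```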

  37. Analyzing and adjusting: Each DDM should have:
  • Directions for administering
  • Student directions
  • The instrument
  • A scoring method
  • Scoring directions

  38. Resources
  Existing:
  • ESE staff
  • Part VII of the Model System
  • Technical Guide A
  • Assessment Quality Checklist and Tracking Tool
  • Assessment Literacy webinar series
  • Materials from Technical Assistance sessions
  • Commissioner's Memorandum
  • Technical Guide B
  What's coming:
  • Exemplar DDMs (August 30th)
  • Other supporting materials

  39. Considerations See pages 5-6 of the handout for DESE recommendations. Table or partner talk.

  40. Time to Consider and Begin to Plan (pages 5-6 in the handout)

  41. Can this be an opportunity? Some options:
  • Writing to text: 9-12? K-12?
  • Research K-12, including specialists?
  • Art, Music, PE, Health
  • Math: one focus K-12?
  • Are there existing assessments that might be modified slightly?

  42. ONE PLAN Consider all of the options, concerns, initiatives, and possibilities as you look at what the next step for your school and district should be. Be ready to share this very basic “first think” on DDMs. After this, you will be given tools to support your assessment of tasks' and curricula's quality, rigor, and alignment.

  43. “The task predicts performance” (Elmore) http://edworkspartners.org/expect-success/2012/09/21st-century-aligned-assessments-identify-develop-and-practice-2/ Page 2 (the Do Now): Process with a partner. Why might Elmore's idea be germane to your planning? What can educators learn from DDMs?

  44. Tools to Facilitate the Work
  • Tools to assess alignment
  • Tools to assess rigor
  • Tools to assess the quality of student work

  45. Two DESE Tools to Facilitate the Tasks
  • Quality Tracking Tool: assess the quality of your inventory of assessments; also use the Lexicon of Quality Tracking Tool Terms (in packet); on the DESE website at http://www.doe.mass.edu/edeval/ddm/
  • Educator Alignment Tool: an interactive database of all educators and the possible assessments that could be used for each; it has been temporarily removed from the DOE website

  46. Tracking Tool Sample (page 11) • Checklist • Tracker

  47. Two Essential Quality Considerations: Alignment and Rigor
  • Alignment to the Common Core, PARCC, and the district curriculum
  • Shifts for the Common Core have been made: complex texts; multiple texts; argument, informational, and narrative writing; math practices; depth over breadth

  48. Daggett’s Rigor/”Complexity” Scale [Diagram: a “Task Complexity Continuum” from 1 (simple) to 5 (complex), running from MCAS ELA and math open-response questions and the MCAS Composition, to PARCC multiple-text tasks, to authentic tasks in Common Core-aligned classrooms]

  49. Teacher Alignment Tool
