
Integrating Model Curriculum Units and District Determined Measures


Presentation Transcript


  1. Integrating Model Curriculum Units and District Determined Measures Curriculum & Instruction Summit November 6-7, 2013

  2. Today • District-Determined Measures (DDMs) • Overview of Educator Evaluation Framework • Definition and Role of DDMs • Example Assessments • 4 Key Messages to Share • Using CEPAs in DDMs • Three Steps • Working with two MCUs as example

  3. Objectives • Understand what DDMs are and how they will be used to inform educators and evaluators. • Gain experience with three steps for using multiple existing assessments in a DDM.

  4. Overview of Educator Evaluation Framework • Everyone earns two ratings: a Summative Performance Rating (Exemplary, Proficient, Needs Improvement, or Unsatisfactory) and a Student Impact Rating (High, Moderate, or Low)

  5. The Educator Evaluation Framework

  6. Student Impact Rating • Evaluators assign a rating based on trends (at least 2 years) and patterns (at least 2 measures) • Based on • Student Growth Percentiles (SGPs) • District-Determined Measures (DDMs)
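
To make the trend-and-pattern requirement concrete, here is a minimal sketch in Python. The cut points and the median-of-medians summary are illustrative assumptions, not ESE's prescribed method.

```python
# Hypothetical sketch: checking that the trend (>= 2 years) and pattern
# (>= 2 measures per year) requirements are met before assigning a
# Student Impact Rating. The cut points and the median-of-medians
# summary are invented for illustration.
from statistics import median

def student_impact_rating(growth_by_year: dict[int, list[float]]) -> str:
    """growth_by_year maps a school year to that year's results, one
    entry per measure (e.g., a median SGP and a DDM growth score)."""
    years = sorted(growth_by_year)
    if len(years) < 2:
        raise ValueError("A trend requires at least 2 years of data.")
    if any(len(growth_by_year[y]) < 2 for y in years):
        raise ValueError("A pattern requires at least 2 measures per year.")
    overall = median(median(growth_by_year[y]) for y in years)
    if overall >= 65:
        return "High"
    if overall >= 35:
        return "Moderate"
    return "Low"

print(student_impact_rating({2012: [48.0, 52.0], 2013: [55.0, 60.0]}))
# -> "Moderate"
```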

  7. Definition of DDM “Measures of student learning, growth, and achievement related to the Massachusetts Curriculum Frameworks, Massachusetts Vocational Technical Education Frameworks, or other relevant frameworks, that are comparable across grade or subject level district-wide. These measures may include, but shall not be limited to: portfolios, approved commercial assessments and district-developed pre and post unit and course assessments, and capstone projects.” • Student Measures • Aligned to Frameworks • Comparable Across District • Expansive Definition of Acceptable Measures 603 CMR 35.02

  8. DDMs Inform: • Student Learning • Curriculum and Programs • Educator’s Impact

  9. Example Assessments Now Available • WestEd identified over 150 assessments that provide over 800 options for different grades/subjects and courses. • Options include: • Traditional and non-traditional assessments • Commercially available and locally-developed options (including submissions from MA districts – thank you!). • Multiple views of the data available: • By pilot area • By assessment coverage area • By approach (build, borrow, buy) • Full sortable HTML table • Access examples here: http://www.doe.mass.edu/edeval/ddm/example/

  10. Example Assessments Now Available • Key features of each option are described in a one-page summary. • Each one-pager includes a link for additional information.

  11. Ongoing Work with WestEd • Collect more complete and open source assessments • We need your help! Please submit: • Assessment directions, materials, items and prompts • Scoring resources (rubrics, answer keys) • Other information (steps for administration, lesson plans, examples of how results are used) • To submit your examples, complete the survey and upload your materials at https://app.smartsheet.com/b/form?EQBCT=52e3762ac7954f93aa40cc879bc38855.

  12. DDMs: 4 Key Messages • DDMs are part of a multidimensional framework. • Decisions about an educator’s impact or effectiveness will never be based on the results of a single DDM. • Focus is on students, not just educators. • To be valuable, DDMs must yield information that will be useful to educators in improving student outcomes. • This is about building capacity. • DDMs provide districts a good reason to consider ways to refine and improve existing assessment practices. • Teachers have the necessary skills to lead the process of identifying DDMs – you can do this! • Many districts have already had success leveraging teachers in developing DDMs.

  13. How Do We Use Curriculum Embedded Performance Assessments (CEPAs) in District Determined Measures (DDMs)? Using Current Assessments in DDMs: Leveraging the Curriculum Embedded Performance Assessments from ESE’s Model Curriculum Units

  14. Advantages of Using CEPAs • Aligned to MA Frameworks: assess what is most important for students to learn and what the educators intend to teach • Informative: inform educators about curriculum, instruction, and practice, and provide information to educators about their students • Final step: provide valuable information to schools and districts about educators’ impact

  15. Considerations in DDMs • Performance vs. Growth • Most CEPAs are measures of performance, not growth • Growth takes into account students’ different starting levels of achievement • Provides all students an equal opportunity to demonstrate growth (illustrated below) • Unit vs. Year • Most CEPAs are given after a short unit (10 lessons) • A measure that spans the year provides a more accurate representation of student growth and of an educator’s impact
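
A small hypothetical example (Python, invented names and scores) makes the performance-vs.-growth distinction concrete: a student who starts the year low can show more growth than a higher-performing peer, even with a lower final score.

```python
# Hypothetical example: performance vs. growth for two students.
# Names and scores are invented, on an arbitrary 0-100 scale.
students = {"Ana": (40, 70), "Ben": (85, 90)}  # (start of year, end of year)

for name, (start, end) in students.items():
    print(f"{name}: performance={end}, growth={end - start}")
# Ana: performance=70, growth=30  <- lower final score, more growth
# Ben: performance=90, growth=5
```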

  16. Potential Solutions • Bringing together multiple assessments (e.g., CEPAs) to measure performance at different points in time (a sketch of this follows) • Multiple administrations of one assessment (or similar versions) during the year • Using multiple measures of growth to make a single DDM
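
A minimal sketch of the first solution, assuming the CEPAs have already been put on a comparable scale (student IDs and scores are invented):

```python
# Hypothetical sketch: combining comparably scaled assessments given at
# different points in the year into a per-student growth score.

def growth_scores(admins: list[dict[str, float]]) -> dict[str, float]:
    """admins is a chronologically ordered list of {student_id: score}
    mappings, one per administration. Returns last-minus-first gains
    for students who took both the first and last assessment."""
    first, last = admins[0], admins[-1]
    return {sid: last[sid] - first[sid] for sid in first if sid in last}

fall   = {"s01": 2.0, "s02": 3.0, "s03": 1.5}
winter = {"s01": 2.5, "s02": 3.0}             # s03 was absent
spring = {"s01": 3.5, "s02": 3.5, "s03": 3.0}
print(growth_scores([fall, winter, spring]))
# -> {'s01': 1.5, 's02': 0.5, 's03': 1.5}
```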

  17. The larger DDM context

  18. Using CEPAs: Three Steps • Step 1: Identifying key content: What content best represents a meaningful sample of the learning a student should complete during the year? • Step 2: Ensuring that change in performance represents growth • Step 3: Selecting an approach for measuring growth

  19. Step 1: Identifying Key Content • Key content includes • What content best represents a meaningful sample of the learning a student should complete during the year? • What content is particularly challenging to teach and for students to learn? • What content is not currently assessed? • What content reflects a district priority? • Example: look across the year using a curriculum map to determine what constitutes a meaningful sample.

  20. Example: 3rd Grade ELA

  21. Types of Key Content • Key content may be taught repeatedly across the year • Key content may be taught once during the year

  22. Key Content: Activity • (10 minutes): In small groups, discuss your district’s curriculum for the content and grade covered by the MCU provided. • Identify examples of key content. Why did you choose these examples? Key Content = What content represents a meaningful sample of the learning a student should complete during the year?

  23. Key Content: Activity (10 Minutes) Share Out. What content did you choose? Explain your thinking.

  24. Using CEPAs: Three Steps • Step 1: Identifying key content • Step 2: Ensuring that change in performance represents growth: Are the assessments similar enough to support meaningful inferences about student growth during the year? • Step 3: Selecting an approach for measuring growth

  25. What is Growth?

  26. Step 2: Does change represent growth? • Are the assessments similar enough to support meaningful inferences about student growth during the year? • Do early assessments provide meaningful information about what students do not understand? • Do future assessments provide meaningful information about what students have learned? • Do students have the opportunity to demonstrate different levels of growth?

  27. Example: 3rd Grade ELA • In the third grade example • Each unit has a CEPA covering the same content • Rubrics are different and hard to compare • Solution • Identify similar evidence • Modify rubrics by adding consistently worded rubric items that assess the same content (a sketch of this follows)
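
A hypothetical sketch of that solution in Python: unit-specific rubric items are mapped onto consistently worded shared items so scores from different CEPAs can be compared. The item names and the mapping are invented.

```python
# Hypothetical sketch: putting unit-specific rubrics on a common scale
# by mapping their items onto consistently worded shared items.
# Unit-specific items (e.g., use of technology) simply stay out of the
# cross-unit comparison.

def common_scores(rubric_scores: dict[str, int],
                  item_map: dict[str, str]) -> dict[str, int]:
    """Translate one CEPA's rubric scores into scores on the shared,
    consistently worded items; unmapped items are dropped."""
    return {item_map[item]: score
            for item, score in rubric_scores.items()
            if item in item_map}

# Mapping from one unit's rubric items to the shared items
unit1_map = {"states_opinion": "opinion_statement",
             "gives_reasons": "supporting_reasons",
             "paragraphing": "organization"}
print(common_scores({"states_opinion": 3, "gives_reasons": 2,
                     "paragraphing": 4, "illustrations": 3}, unit1_map))
# -> {'opinion_statement': 3, 'supporting_reasons': 2, 'organization': 4}
```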

  28. Looking Across Assessments

  29. 3rd Grade ELA Example: Advantages • No need to change what students do • Rubrics are on a consistent scale • The rubrics don’t need to be identical • Important rubric items present in only one CEPA (for example, use of technology in the Newspaper Unit) can still be included

  30. Does Change Represent Growth?: Activity • Begin by assuming that the content covered in the CEPA represents key content. • (8 minutes) In small groups, think about how you would assess growth in this key content. • What other assessments do you have (or could you create) that are aligned to this key content? • Would the assessments be similar enough to support meaningful inferences about student growth? • What are the strengths and weaknesses of using these assessments to support a claim of growth?

  31. Is Change Growth?: Activity (7 Minutes) Share Out. • What assessments best lent themselves to supporting claims of growth? • What were your primary concerns about making a claim of growth? • What modifications did you think would support making a claim about growth?

  32. Using CEPAs: Three Steps • Step 1: Identifying key content • Step 2: Ensuring that change in performance represents growth • Step 3: Selecting an approach for measuring growth: What scoring approach best captures student learning?

  33. Measuring Growth: Activity • (10 min) In groups of four, assign a color to each team member. Form “expert” groups by approach. These groups read and discuss their approach. • (10 min) Return to your original group. Share out the four different approaches. If you have time, you can discuss the advantages and disadvantages of each approach.

  34. Approaches to Measuring Student Growth • Pre-Test/Post-Test • Repeated Measures • Holistic Evaluation • Post-Test Only • Learn more: Webinar 5 (http://www.doe.mass.edu/edeval/ddm/webinar.html) and Technical Guide B (http://www.doe.mass.edu/edeval/ddm/TechnicalGuide-AppxB.pdf)
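
Two of these approaches reduce to simple arithmetic. A hedged sketch (Python, invented scores) contrasts a pre-test/post-test gain, which uses only the endpoints, with a repeated-measures slope fitted across all administrations:

```python
# Hypothetical sketch: two ways to summarize one student's growth from
# the same series of comparably scaled scores.

def pre_post_gain(scores: list[float]) -> float:
    """Pre-test/post-test: difference between last and first score."""
    return scores[-1] - scores[0]

def repeated_measures_slope(scores: list[float]) -> float:
    """Repeated measures: least-squares slope of score vs. administration
    index, i.e., average growth per administration."""
    n = len(scores)
    x_mean = (n - 1) / 2
    y_mean = sum(scores) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(scores))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

scores = [2.0, 2.5, 2.5, 3.5]           # four administrations over the year
print(pre_post_gain(scores))            # 1.5
print(repeated_measures_slope(scores))  # 0.45
```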

  35. Questions and Follow Up • Questions about the process of developing and improving DDMs can be directed to Craig Waterman at cwaterman@doe.mass.edu • Policy questions about DDMs and the teacher evaluation framework can be directed to Ron Noble at rnoble@doe.mass.edu
