
Louisiana Leadership – Session 2: Framework to Move from Common Core to Classroom Practice

This session provides an overview of the LDC framework and focuses on analyzing current LDC implementation data, effective practices in writing LDC modules, text complexity, and scoring and analyzing student work.



Presentation Transcript


  1. A framework to move from common core to classroom practice Louisiana Leadership – Session 2 October 23, 2014

  2. Overview of the Sessions

  3. Outcomes • Analyze current LDC implementation data and plan next steps for supporting teachers • Develop a deeper understanding of effective practices when writing LDC modules • Gain a deeper understanding of the role of text complexity • Calibrate expectations when scoring and analyzing student work

  4. Norms • What working agreements will help make today successful for you?

  5. Opportunities/Expectations

  6. Periodic Scheduled Check-Ins Site-Based Data • Mode of Communication • Status of Implementation • Comments from Teachers • Structures for Supporting LDC Implementation

  7. Jurying Modules How are modules deemed ‘exemplar’? How can we support this process?

  8. Overview of the LDC Framework

  9. Jurying a Module Comparing Economic Systems • Section 1: What Task

  10. Good to Go: Task Clarity and Coherence • Template task uses a writing mode that matches the intended purpose of the prompt. • Task purpose is focused. • Prompt wording is clear. • Prompt wording is unbiased, leaving room for diverse responses. • Prompt wording, content, texts, and student product are aligned to task purpose (a “good fit”). • Task is text dependent, requiring students to go beyond prior knowledge to use evidence from the texts in their responses. • Background statement frames task for students.

  11. What Revisions? Comparing Economic Systems • Section 1: What Task • Overview (page 1)

  12. What Revisions? Comparing Economic Systems Section 1: What Task • Teaching Task Directions: • Identify the revisions • Determine if the module rating would change

  13. Task Revisions 1. Identify the revisions 2. Determine if the module rating would change

  14. Overview of the LDC Framework

  15. Jurying a Module Comparing Economic Systems • Section 2: What Skills? • Old: page 5 • New: pages 8-9 Directions: • Identify the revisions • Determine if the module rating would change

  16. Overview of the LDC Framework

  17. Jurying a Module Comparing Economic Systems • Section 3: What Instruction? • Old: pages 6-11 • New: pages 8-21 Directions: • Identify the revisions • Determine if the module rating would change

  18. Overview of the LDC Framework

  19. Jurying a Module Revised Module: Comparing Economic Systems • Identifying Changes • Top 3 Most Effective Changes

  20. Top 3 Changes

  21. Teacher Self Assessment Tool

  22. Text Complexity • Quantitative Measures • Qualitative Characteristics • Considerations of Readers and Task

  23. Quantitative Dimensions …refer to those aspects of text complexity, such as word length or frequency, sentence length, and text cohesion, that are difficult … for a human reader to evaluate efficiently… and are thus today typically measured by computer software.
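The quantitative dimensions described above can be approximated in software. The sketch below is an illustrative proxy only: it computes average sentence length and average word length with simple regex splitting. It is not Lexile or any other published readability formula.

```python
import re

def quantitative_measures(text):
    """Rough proxy for two quantitative text-complexity dimensions:
    average sentence length (in words) and average word length (in
    characters). Illustrative only; real tools use calibrated formulas."""
    # Split on sentence-ending punctuation; drop empty trailing pieces.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    avg_sentence_len = len(words) / len(sentences)
    avg_word_len = sum(len(w) for w in words) / len(words)
    return avg_sentence_len, avg_word_len

sent_len, word_len = quantitative_measures(
    "Markets allocate goods. Command economies rely on central planning decisions."
)
# sent_len → 5.0 words per sentence, word_len → 6.6 characters per word
```

Longer sentences and longer, rarer words push such measures up; a tool like this flags candidate texts, but the qualitative and reader/task judgments on the next slides still require a human reader.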

  24. Qualitative Characteristics …refer to those aspects of text complexity best measured or only measurable by an attentive human reader, such as levels of meaning or purpose; structure; language conventionality and clarity; and knowledge demands. • Levels of Meaning (literary texts) or Purpose (informational texts) • Structure • Language Conventionality and Clarity • Knowledge Demands: Life Experiences (literary texts) • Knowledge Demands: Cultural/Literary Knowledge (literary texts) • Knowledge Demands: Content/Discipline Knowledge (informational texts)

  25. Matching Reader and Task …variables specific to particular readers (such as motivation, knowledge, and experiences) and to particular tasks (such as purpose and the complexity of the task assigned and the questions posed) must also be considered… Such assessments are best made by teachers employing their professional judgment, experience, and knowledge of their students and the subject.

  26. Analyzing Text

  27. Text Selection Support All • CCSS Appendix B • Readworks.org • Newsela.com • Tweentribune.com • NYTimes Learning Network • CNN Student News ELA • LDE ELA Guidebook Science • Sciencebuzz.org Social Studies • Library of Congress • Ourdocuments.gov

  28. Basics of the Rubric • Seven scoring elements • Four performance levels • Four correlating score points (plus mid-point scores) Reach Associates 2014

  29. What Results? – Section 4: Scoring Student Work with the LDC Rubric • Can be used to score holistically or analytically • 2 rubrics – Informative/explanatory & Argumentative • 7 Scoring Elements: • Focus • Controlling Idea • Reading/Research • Development • Organization • Conventions • Content Understanding

  30. LDC Rubrics – Scoring vs Grading The LDC rubric… • provides feedback to students and teachers • helps students know expectations prior to completing the task • helps teachers gauge the effectiveness of their instructional choices

  31. LDC Rubrics – Scoring vs Grading • The rubric helps students know expectations before the task is completed, and where their strengths and weaknesses are after the task is completed. Reach Associates 2014

  32. Grading vs Scoring • Grading: • Reflects the performance of students relative to expectations at a particular point in time. • Scoring: • Uses fixed standards of quality that do not change over time. Reach Associates 2014

  33. LDC Rubrics – Scoring vs Grading • The LDC rubric is constructed for classroom use and to provide feedback to students and teachers. • It is not a summative rubric, as might be used in state exams to measure a set of absolute criteria. Reach Associates 2014

  34. Grading • 3.5 • 2.0 • 2.5 • 2.0 • 4.0 • 2.5 • 2.0 • Total = 18.5 • 18.5 divided by 28 total points = .66 • 18.5 divided by 7 elements = 2.64 Reach Associates 2014
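The arithmetic on the slide above — seven element scores summed, then expressed both as a fraction of the 28 possible points (7 elements × 4 points each) and as a per-element average — can be checked in a few lines. The scores are the ones shown on the slide.

```python
# The slide's seven element scores, one per LDC rubric scoring element.
scores = [3.5, 2.0, 2.5, 2.0, 4.0, 2.5, 2.0]

total = sum(scores)                    # 18.5 points earned
fraction = total / (4.0 * len(scores)) # 18.5 / 28 possible points ≈ 0.66
average = total / len(scores)          # 18.5 / 7 elements ≈ 2.64
```

The two derived numbers answer different questions: the fraction (0.66) converts the rubric to a percentage-style grade, while the per-element average (2.64) stays on the rubric's 1–4 scale and shows where overall performance sits relative to the performance levels.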

  35. How Does LDC Look and Sound in a Classroom? Reach Associates

  36. Tips for Supporting the Implementation of LDC • Lesson Plans • Mini-Tasks • Collaboration Opportunities • Collaborative Scoring • Support from TOTs • Evaluation • Feedback • Join in the Conversations Reach Associates

  37. Supports Reach Associates 2014 What assistance is available?

  38. LDC Website www.ldc.org Reach Associates 2014

  39. Action Plans: Supporting LDC Implementation • After today’s conversations, what will be the next steps to _____________? How will you do this? • Writing modules • Implementing modules • Scoring student work • Jurying and revising modules • Scaling LDC

  40. Next Time… • Please bring back a module written within your school, district or parish to jury.
