
Student Learning Objectives (SLO)




Presentation Transcript


  1. VALLEY GROVE SD 2014 ~ SLO Workshop Student Learning Objectives (SLO) PPT SOURCE: Dr. Cathleen Cubelic, Cathleen.cubelic@miu4.org

  2. Our Objectives • Understand what an SLO is • Understand the process: Design, Build, & Review • Consider assessment quality and purpose • Examine Webb’s DOK in reference to assessment • Collaborate for implementation • Build the SLO on the template • Use online tools

  3. SLO & Assessment Literacy Pre-Test

  4. Student Learning Objectives: Mind Dump • What do you know? • What have you heard? • What have you researched? • Why are we doing this? • Anything else?

  5. Student Learning Objectives: Your SLO… • …written specifically for you and a specific class/course/content area that you teach. • Every teacher designs one. • Collaborative development is encouraged. • Design, Build, Review – repeat next year/cycle • In the interest of all students • To improve the program • Knowing that good teaching matters most. Many factors and decisions to make: time frame, course content, learning needs, goal, assessment, measures, indicators… Local Decisions!!!

  6. The Rating Tool PDE 82-1 2014-15 …PVAAS Rostering

  7. The SLO in PA is written in relationship to a specific teacher and a specific class/course/content area for which that teacher provides instruction.

  8. “The PSSA test doesn’t completely measure my effectiveness.” SLO CONCEPTS: STUDENT ACHIEVEMENT can be measured in ways that reflect authentic learning of content standards. EDUCATOR EFFECTIVENESS can be measured through the use of student achievement measures.

  9. SLO Definition A process to document a measure of educator effectiveness based on student achievement of content standards.

  10. SLO Process The SLO process contains three (3) action components: • Design(ing): thinking, conceptualizing, organizing, discussing, researching • Build(ing): selecting, developing, sharing, completing • Review(ing): refining, checking, updating, editing, testing, finalizing

  11. Student Learning Objectives: Components • Goal Statement – the “big idea” on which the SLO is based • Endurance – learning has worth beyond the assessment • Leverage – content has value across disciplines • Readiness – provides knowledge/skills necessary for success at future levels of instruction • Performance Measures – assessments used to measure student achievement • Performance Indicators – articulated targets for student achievement • Effectiveness Rating – translation of the number of students meeting performance indicators • How many met the target, and what does that mean?

  12. Student Learning Objectives: Assessment Literacy • Input vs. Output • When we think about how we are changing education today, we are moving from a system that focuses on inputs to one that focuses on outputs. In an input world, what we care about for the integrity of the curriculum is making sure that all of our teachers are giving children exactly the same thing. This is a Betty Crocker curriculum: Betty Crocker has some fantastic recipes, and we want to make sure that the boxed cakes always produce the same outcome. That is what education has been. A publisher hands you the resources and says to follow the instructions to the letter; that is input integrity. • Assessment changes all that. Assessment is about output integrity. • Did the student learn what he needed to learn? How does that make things different? • When we think about outputs, we have to rethink all those input factors. Betty Crocker doesn’t help us; the recipe isn’t the guide. The assessment tells us where we need to add a little salt, where we need a little sugar, and where we need to change what we’re making altogether. Formative and summative assessment give us information about how successful we are, information we need to use in a different way to look at curriculum and instruction integrity and to build upon what we have done previously, adapting and changing in the name of improvement.

  13. Student Learning Objectives: Assessment Literacy • ASSESSMENT – foundation for measuring success • http://www.youtube.com/watch?v=iOcYfrZJWi8 • WEBB’S DOK – a newer counterpart to Bloom’s Taxonomy • Color Laminated Chart • PDF Packet • http://vimeo.com/20998609

  14. What is RIGOR? Rigor in the classroom: “Rigor is creating an environment in which each student is expected to learn at high levels, each student is supported so that he or she can learn at high levels, and each student demonstrates learning at high levels.” – Barbara Blackburn, 2008

  15. Rigor can be accomplished by: • Increasing the complexity of thinking in… • Course content – learning progressions and appropriately leveled text for challenge • Instruction – activities that promote critical thinking, communication building, applying integrated ideas, application of concepts, and promoting responsibility • Assessment – aligned to instructional targets, engages with academic content, requires extended and elaborated responses.

  16. Bloom’s Taxonomy: Old (1950s) vs. New (1990s). HANDOUT: the laminated charts show you a comparison of BLOOM’S TAXONOMY with WEBB’S DEPTH OF KNOWLEDGE.

  17. COMPARISON BLOOM’S KEY POINTS: • 6 levels • Different sources list different verbs • The same verbs appear as examples in more than one cognitive level • This overlap indicates that focusing ONLY on verbs to determine the level of cognitive demand is not fully adequate. WEBB’S KEY POINTS: • The DOK is NOT determined by the verb (Bloom’s) but by the context in which the verb is used and the depth of thinking required. • Names 4 different ways students interact with content. • Each level depends upon how deeply students understand the content.

  18. DOK is about what follows the verb... What comes after the verb is more important than the verb itself… “Analyze this sentence to decide if the commas have been used correctly” does not meet the criteria for high cognitive processing. The student who has been taught the rule for using commas is merely using the rule.

  19. Same Verb – 3 different DOK levels DOK 1- Describe three characteristics of metamorphic rocks. (Requires simple recall) DOK 2- Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences in the two rock types) DOK 3- Describe a model that you might use to represent the relationships that exist within the rock cycle. (Requires deep understanding of rock cycle and a determination of how best to represent it)

  20. DOK is about intended outcome… not difficulty. DOK refers to the complexity of mental processing that must occur to answer a question, perform a task, or generate a product. • Adding is a mental process. • Knowing the rule for adding is the intended outcome that influences the DOK. • Once someone learns the “rule” of how to add, 4 + 4 is DOK 1 and is also easy. • Adding 4,678,895 + 9,578,885 is still DOK 1 but may be more “difficult.”

  21. WEBB’S DOK RESOURCES • Online Search – tons of resources… • Laminated Charts – Webb’s vs. Bloom’s • Handout DOK #1– Levels Described • Handout DOK #2 – Subject Area Info • Handout DOK #3 – Question Stems • Activity: Question Analysis • Math – Trip to the Capital • ELA – Women Poem

  22. SLO Process Components: DESIGN • Thinking about which content standards to measure • Organizing standards and measures • Discussing collective goals with colleagues • Researching what is needed for a high-quality SLO

  23. SLO Process Components: BUILD • Selecting the performance measure(s) • Developing targets and expectations • Completing the template • Sharing the draft materials with other colleagues • Developing/documenting performance task(s)

  24. SLO Process Components: REVIEW • Checking the drafted SLO (including the performance measures) for quality • Refining measures and targets • Editing text and preparing discussion points/highlights for the principal • Finalizing materials • Updating completed SLOs with performance data

  25. Design

  26. What is a Goal Statement? • Definition: a narrative articulating the “big idea” upon which the SLO is built and under which content standards are directly aligned. • Characteristics: • ENDURANCE: encompasses the “enduring understanding” of the standard… beyond the test • LEVERAGE: central to the content area… but has value in other disciplines • READINESS: foundational concepts for later subjects/courses… necessary for the next step

  27. Goal Statement Example • “Students will apply the concepts and the competencies of nutrition, eating habits, and safe food preparation techniques to overall health and wellness throughout the life cycle at individual, family and societal levels.”

  28. SLO Goal (Template #1) • The Goal Statement addresses WHAT the “big idea” in the standards is • The Standards address HOW the skills and knowledge support future learning • The Rationale Statement addresses WHY the “big idea” is a central, enduring concept • http://pdesas.org/standard/PACore

  29. More Considerations for Goal Statements • Do you have previous data to help guide your goal? • What does your growth and achievement look like? • Is there a building/district-wide goal?

  30. Activity: Goal Statement (Template #1) • Within your team, choose a discipline on which you’d like to focus. Preferably, choose a discipline that is very familiar to you. • Complete “Template #1 Goal Statement.” • We will post them for the entire group.

  31. Build

  32. Template Section 1

  33. Goal • Goal statement should articulate an appropriate “big idea”. http://pdesas.org/standard/PACore • Standards should be the appropriate Focus Standards supporting the goal. • Rationale statement should be reasons why the Goal statement and the aligned Standards address important concepts for this class/course.

  34. Template Section 2

  35. Performance Indicator • Definition: a description of the expected level of student growth or achievement based on the performance measure. ***Articulates targets for each performance measure*** Answers two questions: • Does the indicator define student success? • What is the specific measure linked to the indicator?

  36. Examples of Performance Indicator Targets • Students will achieve Advanced or Proficient on all four criteria of the Data Analysis Project rubric. • Students will score an average of 3 or better on five different constructed-response questions regarding linear modeling, according to the general description of scoring guidelines. (http://static.pdesas.org/Content/Documents/Keystone%20Scoring%20Guidelines%20-%20Algebra%20I.pdf) • Students will improve a minimum of 10 percentage points from pre- to post-test for material in each semester. • Students will show “significant improvement” in the Domain of Measurement on the Classroom Diagnostic Tools Mathematics Grade 7 assessment from the first to the last administration.
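The pre-to-post growth target above comes down to a simple arithmetic check. A minimal sketch in Python; the function name, the example scores, and the 10-point threshold default are illustrative assumptions, not part of any PDE template:

```python
# Hypothetical check for a growth-style performance indicator: did a
# student improve by at least `min_gain` percentage points from
# pre-test to post-test?
def met_growth_target(pre_pct, post_pct, min_gain=10.0):
    """Return True when the post-test score exceeds the pre-test score by min_gain points."""
    return (post_pct - pre_pct) >= min_gain

print(met_growth_target(62.0, 75.0))  # True: gain of 13 points
print(met_growth_target(80.0, 85.0))  # False: gain of only 5 points
```

Applying such a check to every student in the roster yields the count of students meeting the indicator, which is what the effectiveness rating translates.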

  37. Performance Indicator – Focus student group A description of the expected level of achievement for each student in a subset of the SLO population (1F) based on the scoring tools used for each performance measure (4A). Subset populations can be identified through prior student achievement data or through content-specific pretest data.

  38. Examples of Performance Indicator Targets: Focused Student Group • Students who scored below the 30th percentile on their benchmark AIMSweb R-CBM probe will score above the 30th percentile by the end of the school year using the national norms. • Students who scored below a 2 on the pre-test will improve a minimum of one level on the post-test.

  39. SLO Design Coherency (diagram: Rating – All Students / Targeted Students)

  40. Activity: Growth and Mastery • Which assessments may be used as growth measures, mastery measures, or both?

  41. What are the characteristics of a quality assessment? • Write three. • Report out a summary from your table.

  42. Good assessments have: • A specific and defined purpose • A mixture of question types • Items/tasks with appropriate DOK levels • Items/tasks that are standards-aligned • A quality rubric • A standardized scoring method • Academic rigor • A reasonable time limit for completion • An appropriate readability level • Multiple methods of student demonstration • Validity and reliability • Well-written directions and administration guidelines • Cut scores for performance categories

  43. Academic Rigor • Standards-Aligned • Developmentally Appropriate • Focused on Higher-Order Thinking

  44. Weighting, Linking, or Otherwise • 1. Standard – You may consider each Performance Indicator equal in importance. • 2. Linked – You may link multiple Performance Indicators if you like. Do this for “pass before moving on” assessments. • 3. Weighted – You may weight multiple Performance Indicators if you like. Do this when you believe one or more PIs are more complex or more important than others.

  45. Standard Scenario

  46. Weighting Scenario • Physics class with three (3) PI targets: Total Score = 72.5%
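A weighted total like the 72.5% above is just a weighted average of the per-indicator results. A minimal sketch; the scores and weights below are assumptions chosen only to illustrate the arithmetic (the slide's actual table of values is not reproduced here):

```python
def weighted_slo_score(scores, weights):
    """Combine per-indicator scores (0-100) into one total using weights that sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical physics class with three PI targets:
scores = [75.0, 70.0, 70.0]   # assumed percent of students meeting each PI target
weights = [0.50, 0.25, 0.25]  # assumed relative importance of each PI
print(weighted_slo_score(scores, weights))  # 72.5
```

Under the standard (unweighted) option, each weight would simply be 1/3, and the total would be the plain average of the three indicator results.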

  47. Template Section 3

  48. Goal-Indicator-Measure
