
Student Learning Objectives in Teacher Evaluation: Support Tools for Peer Observers




Presentation Transcript


  1. Student Learning Objectives in Teacher Evaluation: Support Tools for Peer Observers Objectives To provide peer observers with: criteria to review the quality of SLOs/IAGDs, and tools to use when meeting with teachers about student progress

  2. Part I: Ensuring SLO Quality: Helping Your Teachers Develop Student Learning Objectives Example Mrs. Smith, the 6th-grade science teacher to whom you have been assigned as a peer observer, asks you to review her Student Learning Objectives before she meets with her building principal, or her principal has asked the two of you to work together to revise and improve the SLO that the teacher submitted. QUESTION #1: What criteria will you use to help her determine whether her SLO meets expectations?

  3. Step 1: Review Elements of SLOs with Your Peer Observee You ask Mrs. Smith to read slides 4 through 16 to ensure that the two of you have a shared understanding of SLOs.

  4. Elements of an SLO • SLO Focus Statement • Baseline/Trend Data • Student Population • Standards and Learning Content • Interval of Instruction • Assessments • Indicators of Academic Growth and Development (IAGDs)/Growth Targets • Instructional Strategies

  5. SLO Process • 1. Review data (district, building, and baseline, trend, and historical student data). • 2. Write the SLO focus statement: How does the data support the SLO? • 3. Select the target population: Why is this target group included? • 4. Identify the standards connected to the learning content: What are the big and core ideas for which the baseline data indicates a need?

  6. SLO Process • 5. Specify the interval of instruction: Over what time period will instruction in the learning content occur? • 6. Determine how you will measure the outcome of the SLO: What will assess student growth? • 7. Set the IAGDs: What are the quantitative targets that will demonstrate achievement of the SLO? • 8. Identify instructional strategies and supports needed to achieve the SLO.

  7. Checklist of Expectations for SLOs: Some Reminders for Peer Observers • An individual Student Learning Objective must address a central purpose of the teacher’s assignment and pertain to a large proportion of the students on the roster for the course or subject area with which the objective is aligned. • Tiered targets are set in the IAGDs according to students’ starting points, because students may begin at varying levels of preparedness. • Students who begin at an instructional interval below grade-level proficiency should be expected to reduce the gap between their knowledge and grade-level proficiency by the end of the interval of instruction.

  8. Three Criteria of SLOs: Some Reminders for Evaluators • 1. High Expectations: Does the objective reflect high expectations for student improvement and aim for mastery of content or skill development? • 2. Rigor of Target: Does the target reflect both the greater depth of knowledge and the complexity of thinking required for success? • 3. Baseline/Trend Data: Will the evidence source provide the information needed to support established targets?

  9. Priority of Content: What Teachers Need to Know • High-quality SLOs include strong justifications for why the goal is important and achievable for this group of students. • Rationales should draw upon assessment, baseline, and trend data, student outcomes, and curriculum standards, and should be aligned to broader school and district goals. • The SLO should cover the content, skills, and specific standards to which it is aligned. • SLOs should be broad enough to represent the most important learning or overarching skills, but narrow enough to be measured.

  10. Rigor of SLO: What Teachers Need to Know • The target for student growth should reflect high expectations for student achievement that are developmentally appropriate and rigorous yet attainable. • The target is anchored in baseline data, including historical data (i.e., district, school, and student data) and multiple measures, if possible. • If appropriate, the SLO differentiates targets for individuals or groups of students based on baseline data so that all targets are rigorous yet attainable. Rigor is determined by past performance of students, a year’s growth, the percentage of students who attain the target, or other measures. • The rationale provided by the teacher shows that the target is rigorous because it is based on data and, as appropriate, exceeds past student performance, demonstrates a year’s worth of growth, or reflects other important outcomes.

  11. Quality of Evidence: What Teachers Need to Know About IAGDs • The assessment should align to both students’ learning objectives and the appropriate grade- or content-specific standards. It should cover the key subject and grade-level content standards and curriculum that will be taught during the interval of instruction. When examining assessments for alignment, educators should look for the following: • Items on the assessment should cover all key subject/grade-level content standards. • No items on the assessment should cover standards that the course does not address. • Where possible, the number of assessment items should mirror the distribution of teaching time devoted to concepts or the curriculum focus. • For example, if a foreign language teacher devotes almost equal amounts of time to developing students’ reading comprehension, listening comprehension, oral communication, and written communication skills, he or she should not use an assessment that devotes 90 percent of its items to reading comprehension. Instead, the distribution of the test should mirror instruction, meaning that about a quarter of the test should focus on each of the four skills listed above.
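The "items mirror teaching time" rule above is just a proportion calculation. As a side illustration (not part of the original presentation), a sketch of that allocation follows; the function name and the largest-remainder rounding rule are my own assumptions, not anything the deck prescribes.

```python
def allocate_items(time_shares, total_items):
    """Illustrative sketch: distribute assessment items in proportion to
    instructional time, rounding by largest fractional remainder."""
    total_time = sum(time_shares.values())
    raw = {skill: share / total_time * total_items
           for skill, share in time_shares.items()}
    alloc = {skill: int(r) for skill, r in raw.items()}
    # hand any leftover items to the skills with the largest remainders
    leftover = total_items - sum(alloc.values())
    for skill in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:leftover]:
        alloc[skill] += 1
    return alloc

# Equal instructional time across the four language skills -> a quarter
# of a 40-item test each, matching the foreign-language example above.
print(allocate_items({"reading": 1, "listening": 1, "oral": 1, "written": 1}, 40))
```

With unequal time shares the same rule keeps the item counts proportional while still summing to the test length.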

  12. Quality of Evidence: What Teachers Need to Know (cont’d) • The items or tasks should match the full range of cognitive thinking required during the course. • For example, if the main foci of the mathematics content standards are solving word problems and explaining reasoning, some questions or items on an assessment should require students to solve word problems and explain how they arrived at their answers. The assessment should require students to engage in higher-order thinking where appropriate. These items or tasks may require students to use reasoning, provide evidence, make connections between subjects or topics, critique, or analyze.

  13. What to Include in Your SLO

  14. What to Include in Your SLO p.2

  15. What to Include in Your SLO p.3

  16. What to Include in Your IAGD p.1

  17. Step 2: Use the Rubric and Guiding Questions to Analyze the SLO and Revise as Needed Now you and Mrs. Smith sit down together to review the SLO that she created. Use the information on slides 13-16 for the initial analysis, as well as the rubric on slides 18-22. Refer to slides 23-26 for helpful questions to guide your analysis.

  18. SLO RUBRIC

  19. SLO RUBRIC

  20. SLO RUBRIC

  21. SLO RUBRIC

  22. SLO RUBRIC

  23. Examining an SLO: Questions to Ask About Expectations • Is the objective statement focused on the right content and skills? • Is the objective statement the appropriate scope/grain-size? • Is the objective statement aligned to state and/or national standards? • Is this objective statement aligned to school and/or district level priorities (where applicable)?

  24. Examining an SLO: Questions to Ask About Rigor of Target • Are the targets aligned with expectations for academic growth or mastery within the interval of instruction? • What data source(s) informed the targets that were set? • Are the targets rigorous, yet attainable for all students? • Will students be “on track” and/or reduce gaps in achievement if they reach the targets?

  25. Examining an SLO: Questions to Ask About Baseline/Trend Data • Does the assessment measure all of the identified content/skills included in the Student Learning Objective? • Does the assessment provide the specific data needed to determine if the objective was met? • Does the assessment include an adequate number of items or points to measure the content?

  26. Examining an SLO: More Questions to Ask About Baseline/Trend Data • Does the assessment include items or tasks that represent a variety of Depth of Knowledge levels? • Is the assessment accompanied by a rubric or scoring guide? • Do sources of evidence overlap and provide multiple measures of the same standards? • Are sources of evidence supplementing each other to capture the full range of standards addressed by the Student Learning Objective? • Can the assessment be compared across classrooms and schools?

  27. Case Study: Using Baseline Data to Generate SMART Goals You and Mrs. Smith review her goal statement (below) and rate the statement as “acceptable.” By the end of SY 13-14, all [of my] 6th-grade students will be able to a) use the scientific method to plan and conduct experiments and b) analyze scientific texts and craft written responses supported by textual evidence in the four Areas of Inquiry: Formulating Questions & Hypothesizing, Planning & Critiquing Investigations, Conducting Investigations, and Developing and Evaluating Explanations. Next, she shares the data on which she is basing her decisions for establishing her Indicators of Academic Growth and Development (see next slide).

  28. An Excerpt from Mrs. Smith’s 6th Grade Science SLO

  29. Mrs. Smith’s First Draft IAGD • Her initial IAGD was framed as, “80% of students who scored below a 70 on the pre-test will increase their scores by 20 points on the post-test in June 2014.” • You realize that this target does not address all students and is not designed to decrease achievement gaps between the lowest and highest performers. • How might your conversation go?

  30. A Conversation with Mrs. Smith • You: “I see that you want to ensure that the 94 students who scored below 70 will improve significantly over the course of the year. What about the rest of the students? Do you also want to reduce the gaps in achievement among other groups?” • Mrs. Smith: “Well, yes, of course… But I don’t think it’s realistic for me to expect all students to reach a goal of 80% on the post-test—especially not for the 16 who scored below 30. And the ten who scored between 70 and 87 will probably improve anyway.” • You: “O.K., then. So I’m hearing you say that it’s realistic to assume that everyone will improve. Does it make sense to set differentiated growth targets that reflect varying rates of improvement? For example, could you expect most of the 16 students who scored below 30 to achieve a score of… say… 60 on the post-test?” • Mrs. Smith: “It would be a challenge, but I know that most of them are either special education or ELL students, so I could consult with their teachers about strategies I can use to enhance their understanding… So, yes, I think I could make sure that most of them would score a 60 or better.” • You: “And those high performers—can you set targets that represent improvement for them?” • Mrs. Smith: “Sure. I think it’s safe to assume that most of them will score at least a 90.” After more discussion, you and Mrs. Smith settle on differentiated targets for each of her baseline groups (see next slide).
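As a side illustration (not in the original deck), the differentiated targets Mrs. Smith lands on amount to a simple rule keyed to baseline bands. The cutoffs (30 and 70) and target values below are hypothetical, loosely echoing the conversation; actual tiered IAGDs would come from the teacher's own baseline data.

```python
def growth_target(baseline):
    """Hypothetical tiered IAGD rule: assign a post-test target by baseline band.
    Cutoffs and targets echo the sample conversation, not a real SLO."""
    if baseline < 30:
        return 60                       # lowest band: substantial gap-closing growth
    elif baseline < 70:
        return min(baseline + 20, 100)  # middle band: roughly 20 points of growth
    else:
        return max(baseline, 90)        # high performers: still expected to improve

# A handful of illustrative pre-test scores spanning the three bands
pretest_scores = [25, 48, 68, 75, 87]
print([growth_target(s) for s in pretest_scores])  # -> [60, 68, 88, 90, 90]
```

The point of the tiering is visible in the output: every student has a target above their baseline, and lower starting points carry larger expected gains, which is what keeps the IAGD both rigorous and attainable for all groups.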

  31. Sample Differentiated IAGD

  32. Part II: Conversations with Teachers About Student Progress Example Six weeks into the school year, you and Mr. Jones, a fourth-grade teacher to whom you have been assigned, are meeting to examine the evidence of students’ performance relevant to his SLO, which is framed as “Students will demonstrate proficiency in selecting and using relevant information from the text in order to summarize events and/or ideas in the text.” His IAGD reads: “At least 85% of my 22 students will attain goal on the Grade 4 CMT in March 2014.” He is using the DRA2 to assess students’ progress toward his goal. He has color-coded students (see excerpt, next slide) into three categories: 1’s (orange) perform well below goal; 2’s (yellow) perform below goal; 3’s (no color) perform at/above goal on the district assessment. You notice that 14 of his 22 students fall within the “2” range of scores on total comprehension. Also, only two of those 14 students perform at goal on “scaffolded summarizing,” and two of them are performing significantly below goal on that skill. How might your conversation go?

  33. A Conversation with Mr. Jones • You: [Praise] “You’ve done a great job of laying out the data. It’s easy to see at a glance how your students are performing.” [Analysis] “So… what is the data telling you?” • Mr. Jones: “Clearly, most of my students need more work on summarizing non-fiction texts if I’m to reach my SLO by March. I have a number of students who are still struggling with fluency and literal comprehension.” • You: [Realignment Comments] “O.K., let’s zero in on those weaker performers. You mentioned struggles with fluency and with literal comprehension. Did you notice any correlation between these issues and performance on the scaffolded summary items? Were there any trends or connections that you noticed?” • Mr. Jones: “Actually, no. I just looked at the scores on the summaries and got nervous. But now I see that students who were weak in either skill also performed poorly on the summarizing portion of the assessment.” • You: “Let’s dig a little deeper to see if we can find some connections that will help you group your students into strategy groups.” [Data-Focusing Comments] “Let’s look at the breakdown of Oral Reading and the set of skills relevant to basic comprehension. We might find some connections there that will help you group students according to their differentiated learning needs.” [see next slide]

  34. Oral Reading/ Comprehension Data from Mr. Jones’ DRA2 Results

  35. A Conversation with Mr. Jones • You: [Continue probing] “What obstacles to reading comprehension, and more specifically to summarizing, are revealed in this data?” • Mr. Jones: “It looks like all students whose rate of reading is poor also have trouble summarizing. It also looks like there’s a connection between reading with expression and summarizing, but I can’t tell if it’s just a correlative relationship or if it’s actually causal… I’d need to know more. I do know that rate of reading impacts a student’s ability to understand text, especially when the text is complex. It’s reasonable to assume that students who read without expression are concentrating more on getting the words right than on understanding what the text is saying.” • You: “It’s certainly worth exploring. Do you see any implications for instruction and for grouping your students?” • Mr. Jones: “It looks like most of the class would benefit from work on reading with expression… I wonder if there’s a way to link that to summarizing. I’ll also need to work with the group of students who scored a “1” on reading rate… as a group, they are performing below goal in most areas. Most of them weren’t using text features for support… so that gives me some ideas….”

  36. A Conversation with Mr. Jones • You: [Focus on Action Steps] “O.K., now you’re starting to develop an instructional action plan. Next week, let’s meet to go over your action plan for instruction and progress monitoring. It should incorporate what we’ve uncovered using this data, and anything else you discover through your informal assessments. Meanwhile, I’m going to leave you with some questions to address as you develop your plan: • What are the prerequisite skills students need to be able to summarize text? • Which need explicit and/or intensive instruction? • How will you sequence your instruction to ensure that skills build on one another? • What strategies will you use for students who struggle? • How will you check for understanding and assess mastery? Are there alternatives to the DRA2 that you might use to monitor student growth?” When you and Mr. Jones meet the following week, Mr. Jones presents his action plan for instruction and monitoring student progress (see next 3 slides). He worked with the literacy coordinator to develop the plan, and acknowledges that he needs more professional development on effective differentiation to support struggling readers.

  37. Mr. Jones’ Plan for Monitoring Students’ Progress Toward SLO

  38. Mr. Jones’ Plan for Monitoring Students’ Progress Toward SLO

  39. Mr. Jones’ Plan for Monitoring Students’ Progress Toward SLO

  40. Mr. Jones’ Plan for Monitoring Students’ Progress Toward SLO

  41. Data-Driven Conversations • Discussion about the use of data “is not intended to blame anyone; instead it is aimed at understanding a system that avoids using precise information that can guide and inform better practice. Goal-setting that uses data to monitor progress can be a threatening endeavor. Preparation and ongoing training have often failed to provide staff with the ability or confidence to believe they can succeed. This insecurity hampers every teacher and administrator including our most talented and industrious ones.” (Mike Schmoker, 1999) • Schools that operate as professional learning communities use formative assessments on a frequent basis to ask, “Are students learning, and what steps must we take to address the needs of those who have not learned?” • Source: NSDC Coaches Academy http://www.co-csdc.org/news/pdf/DataDrivenHandout.pdf

  42. Appendix: Resources to Use for Your Data-Informed Conversations and Decisions The information that follows includes questions and question stems to use that will open up collegial conversations about the results of student performance.

  43. Questions for C.E. to Ask the Teacher: • What questions does the data raise for you as a classroom teacher? • What patterns do you notice in the data? Successes? Areas of need? • What are the patterns in strands or standards across all of your classes? • What do you notice about subgroup performance? What differences, if any, exist among the subgroups? • What instructional strategies or teaching ideas do you have to address areas of need? • What will you do to increase the level of student proficiency in the targeted areas? • When will you assess student learning? What common assessment will you use?

  44. More Questions for C.E. to Ask: Areas of need/challenge/difficulty • What are the students’ greatest areas of need? • What do you do in your own classes to address these? • What processes in programs/curriculum currently address these? • Does the student data align with or differ from your own perceptions of student needs? • What support do you need to meet students’ needs? Strategies or approaches to help students learn • Which approaches do students find most helpful for their learning? • Which of these do you do, consciously or incidentally? • Which of these are approaches students prefer because they mean the teacher does more of the work? • What do you do to make students more responsible for their own learning? • Which of these could be addressed through subjects/classes? • Which of these approaches can be done on the spot? • Which of these need preparation or further skill development on your part?

  45. Question/Conversation Starters Interpretation • What you are describing could mean… • Could it be that what you are saying is… • Is it possible that… Mediational • What criteria do you use to… • What might happen if… • How would it look… • What is the impact of … on students… • How do you decide… Instructional • Would you like more information; to review some options; some resources… • A couple of things to keep in mind are… • Research seems to indicate… • Sometimes it is helpful if…

  46. More Question/Conversation Starters Summarizing • You have stated that your goal is… • Let’s review the key points in our discussion… • Tell me your next steps… • So this is your homework… Transformational • Let’s try a role-play… • Ground that assessment for me… could you make a different assessment… • How could we turn that rut story into a river story… • What new “way of being” are you willing to try out…

  47. Progress Monitoring Questions for Teachers to Ask Themselves • What resources do I have available to monitor student growth that were not used in my SLO assessment? • How can I spiral the content from my Learning Objective into lessons throughout the year? • How can I support my team and ensure that we all are monitoring student progress? • What professional development would help me achieve my goals? • What additional resources would help me achieve my goals? • What are other teachers doing to incorporate their Learning Objective into lessons throughout the year?

  48. Practice What would you do? You can now practice using this information by opening the PowerPoint in the Self-Directed Task Folder and reading and reflecting on the scenario described.
