
Data-Driven Instruction Mid-Year Follow-up Workshop


Presentation Transcript


  1. Data-Driven Instruction Mid-Year Follow-up Workshop Paul Bambrick-Santoyo

  2. APOLLO 13 • In the face of great adversity, how did the Houston team respond? • What key statements and actions helped the team save Apollo 13?

  3. GROUND RULES • EVERY CHALLENGE IS REAL & UNIQUE • THE EXPERTS ARE IN THE ROOM • WE ABSOLUTELY CAN FIND SOLUTIONS TO ANY PROBLEM • NO ISLANDS ALLOWED • BOTTOM LINE: IMPROVE ACHIEVEMENT • ACT NOW TO BE READY FOR TOMORROW

  4. OFF-RAMPS ON THE ACHIEVEMENT HIGHWAY • Unaligned interim assessments • No structured time in school day • Infrequent interim assessments • Externally analyzed • No analysis done by teachers • Re-teaching is stabbing in the dark • No follow-up: no action

  5. THE FOUR KEYS: • DATA-DRIVEN INSTRUCTION AT ITS ESSENCE: • ASSESSMENTS • ANALYSIS • ACTION • in a Data-driven CULTURE

  6. ASSESSMENTS: • PRINCIPLES FOR EFFECTIVE ASSESSMENTS: • COMMON INTERIM: • At least quarterly • Common across all teachers of the same grade level • DEFINE THE STANDARDS, ALIGNED: • To the state test (format, content, & length) • To the instructional sequence (curriculum) • To college-ready expectations

  7. ASSESSMENTS: • PRINCIPLES FOR EFFECTIVE ASSESSMENTS: • REASSESS: • Standards that appear on the first interim assessment appear again on subsequent interim assessments • WRONG ANSWERS: • Illuminate misunderstanding • TRANSPARENT: • Teachers see the assessments in advance

  8. THE FOUR KEYS: • DATA-DRIVEN INSTRUCTION AT ITS ESSENCE: • ASSESSMENTS (Interim, Aligned, Reassess, Transparent) • ANALYSIS • ACTION • in a Data-driven CULTURE

  9. ANALYSIS: • IMMEDIATE: Ideal 48 hrs, max 1 wk turnaround • BOTTOM LINE: Includes analysis at question level, standards level and overall—how well did the students do as a whole • TEST-IN-HAND analysis: Teacher & instructional leader together • TEACHER-OWNED analysis • DEEP: Moves beyond “what” to “why”

  10. THE FOUR KEYS: • ASSESSMENTS (Aligned, Interim, Reassess, Transparent) • ANALYSIS (Quick, Bottom line, Teacher-owned, Test-in-hand, Deep) • ACTION • in a Data-driven CULTURE

  11. ACTION: • PLAN new lessons based on data analysis • ACTION PLAN: Implement what you plan (dates, times, standards & specific strategies) • LESSON PLANS: Observe changes in lesson plans • ACCOUNTABILITY: Observe changes through classroom observations and in-class assessments • ENGAGED STUDENTS: Know the end goal, how they did, and what actions they’re taking to improve

  12. THE FOUR KEYS: • ASSESSMENTS (Aligned, Interim, Reassess, Transparent) • ANALYSIS (Quick, Bottom line, Teacher-owned, Test-in-hand, Deep) • ACTION (Action Plan, Accountability, Engaged) • in a Data-driven CULTURE

  13. DATA-DRIVEN CULTURE: • VISION: Established by leaders and repeated relentlessly • “REAL” LEADERSHIP TEAM: Trained and highly active • CALENDAR: Set in advance, with built-in time for assessments, analysis & action • PROFESSIONAL DEVELOPMENT: Aligned

  14. THE FOUR KEYS: • ASSESSMENTS (Aligned, Interim, Reassess, Transparent) • ANALYSIS (Quick, Bottom line, Teacher-owned, Test-in-hand, Deep) • ACTION (Action Plan, Accountability, Engaged) • in a Data-driven CULTURE (Vision, Leadership, Calendar, PD)

  15. Phases of Data-Driven Instruction & Interim Assessments Adapted from research by the Camden County, GA Public School District

  16. PHASE 1 • IGNORANCE, CONFUSION, OVERLOAD: • “I don’t understand what we’re doing.” • “This is too much! How am I really supposed to use all this?” • “All this analysis! What’s wrong with just grading the old-fashioned way?” • “Huh? Interim assessments? What are those?”

  17. PHASE 2 • FEELING INADEQUATE & DISTRUSTFUL: • “How can two questions on a test possibly establish mastery? These tests can never measure what I know about my students’ learning.” • “This idea of an assessment is terrible! We don’t teach in that format! We teach it this way.”

  18. PHASE 3 • CHALLENGING THE TEST: • “Question #26 is a poor question. Answer “b” is a trick answer.” • “Question #11 is too hard. We need to make it easier.” • “The kids made silly mistakes because of the pressure of this pointless test. They know this stuff.” • Undertone: “I’ve never looked at a test item before, but I’m going to now if you’re going to hold me accountable.”

  19. PHASE 4 • ANALYTICAL but SUPERFICIAL: • “They just don’t do well on word problems. I just need to do more word problems.” • “They just don’t read enough. I’ll get them to read more.”

  20. PHASE 5 • LOOKING FOR CAUSES, BUT NO ACTION: • “These wrong answers tell me that they don’t know the difference between a summary and a theme.” • “I always taught grammar in isolation, and this test asked for it in a more authentic form.” • “The problem with solving algebraic equations for them was actually the inability to subtract negative integers.”

  21. PHASE 6 • CHANGING TEACHING PRACTICES: • Teachers follow through on analysis • Lesson plans reflect spiraling, re-teaching, etc. • Teachers look for best practices outside of their own classroom

  22. Phases of Interim Assessment Implementation • Phase 1: Ignorance, confusion and overload—“This is too much!” • Phase 2: Feeling inadequate and distrustful—“This test is terrible!” • Phase 3: Challenging the test—“Question 26 was too hard—a trick question.” • Phase 4: Analytical but superficial • Phase 5: Looking for causes, no action • Phase 6: Changing teaching practices

  23. Self-Evaluation: Implementation of Data-Driven Instruction

  24. The Big Balloons: Challenges We Face

  25. CHALLENGES: • NEVER LEFT THE STATION: I’m overwhelmed and/or frustrated; where do I begin? • NO TEACHER BUY-IN: How do I get teachers/leaders invested? • SURFACE TEACHER BUY-IN, LATER RESISTANCE: I thought they got it, but now they’re resisting. What do I do? • UNSUPPORTIVE MENTOR PRINCIPAL: What real results can I accomplish in this context? • STUDENT/PARENT BUY-IN: How do you get them invested in this process? • NO LEADERSHIP TEAM BUY-IN: How do I deal with a leadership team that doesn’t believe in the power of data-driven instruction?

  26. CHALLENGES: • RUNNING TEAM MEETINGS: How do I limit the impact of a negative teacher who’s trying to undermine the process? How do I bring experienced and novice teachers together effectively? • CITY/SCHOOL ASSESSMENTS ARE POOR: How do I adapt to mandatory weak assessments that aren’t aligned to the state test? • CREATING ASSESSMENTS: How do I access quality interim assessment material? How do I build assessments for other subjects or that align to college readiness? • ANALYZING RESULTS: What effective templates are working? How do I alleviate the time drain? Where do I begin when most standards are deficient?

  27. CHALLENGES: • TARGETED RE-TEACH (HOW?): What are effective strategies to reach mastery? How do I do differentiated instruction with teachers who can’t manage their class or haven’t done small groups before? • TIME: How do I creatively manage our school schedule to implement all aspects of data-driven instruction? How do I gain time for analysis/action if I cannot extend their day? • COMPETING INTERESTS: What are ways to implement data-driven instruction when there are so many other initiatives in the school? • “GOOD SCHOOLS/TEACHERS” PROBLEM: What are some strategies for getting staff on board who already have “good scores”? How do I create urgency? • ADJUSTING CALENDAR: How can I adjust the original NLNS data-driven calendar when I’m really just starting now? How do I work around a poor school calendar that isn’t what my school needs?

  28. Results Meeting Protocol: Summer Foundations Review

  29. ACTION: RESULTS MEETING (50 MIN TOTAL) • IDENTIFY ROLES: Timer, facilitator, recorder (2 min) • IDENTIFY OBJECTIVE to focus on (2 min or given) • WHAT WORKED SO FAR (5 min) • [Or: What teaching strategies have you tried so far?] • CHIEF CHALLENGES (5 min) • BRAINSTORM proposed solutions (10 min) • [See protocol on next page] • REFLECTION: Feasibility of each idea (5 min) • CONSENSUS around best actions (15 min) • [See protocol on next page] • PUT IN CALENDAR: When will the tasks happen? When will the teaching happen? (10 min)

  30. RESULTS MEETING STRUCTURE: PROTOCOLS FOR BRAINSTORMING/CONSENSUS • PROTOCOL FOR BRAINSTORMING: • Go in order around the circle: each person has 30 seconds to share a proposal. • If you don’t have an idea, say “Pass.” • No judgments should be made; if you like an idea, when it’s your turn simply say, “I would like to add to that idea by…” • Even if 4-5 people pass in a row, keep going for the full brainstorming time. • PROTOCOL FOR REFLECTION: • 1 minute of silent personal/individual reflection on the list: what is doable and what isn’t for each person. • Go in order around the circle once: depending on the size of the group, each person has 30-60 seconds to share their reflections. • If a person doesn’t have a thought to share, they say “Pass”; come back to that person later. • No judgments should be made.

  31. RESULTS MEETING STRUCTURE: PROTOCOLS FOR BRAINSTORMING/CONSENSUS • PROTOCOL FOR CONSENSUS/ACTION PLAN: • ID key actions from the brainstorming that everyone will agree to implement • Make actions as specific as possible within the limited time • ID key student/teacher guides or tasks that need to be done to be ready to teach; ID who will do each task • Spend remaining time developing concrete elements of the lesson plan: • Do Now’s • Teacher guides (e.g., what questions to ask the students or how to structure the activity) • Student guides • HW, etc. • NOTE: At least one person (if not two) should record everything electronically to send to the whole group

  32. Adjusted Results Meeting: Best Practice Sharing

  33. BEST PRACTICES SHARING (1 HR TOTAL): • Divide into subgroups (if the group is large enough): ID the specific challenges each group will address (2 min) • SUBGROUPS: • Identify roles: Timer, facilitator, recorder, electronic copier (2 min) • Share best practices; ask follow-up questions to understand success (3-5 min each; 20 min) • Imagine challenges others are having: Contemplate whether your experience addresses those challenges (5 min) • Consensus around best actions to “publish”: Make plans for what you can do in 30 min (10 min) • Publish: Produce a dazzling display of best practices and put it on the wall around your balloon (20 min) • Select a representative: Pick one expert to stay as a guide

  34. Adjusted Results Meeting: Finding Solutions to Our Challenges

  35. FACING OUR CHALLENGES: FINDING SOLUTIONS (50 MIN TOTAL) • GROUP: • Identify roles (2 min) • Read through the artifacts on the wall (10 min) • Share the challenges you are having (10 min) • Identify the best practices that would be useful for your context (5 min) • Brainstorm how to make those practices 100% applicable to your school building (5 min) • INDIVIDUAL: • Write a concrete plan for your school upon return from the workshop (18 min)

  36. Popping the Balloons: SOLVING All Challenges

  37. Analysis & Action: The Role of Leadership

  38. THE FOUR KEYS: • ASSESSMENTS (Aligned, Interim, Reassess, Transparent) • ANALYSIS (Quick, Bottom line, Teacher-owned, Test-in-hand, Deep) • ACTION (Action Plan, Accountability, Engaged) • in a Data-driven CULTURE (Vision, Leadership, Calendar, PD)

  39. Effective Analysis Meetings Video Case Studies

  40. OBSERVER #1: ANALYZE TEACHER’S ROLE • IDENTIFY TEACHER LANGUAGE/ANALYSIS/ACTION: • What does the teacher say? • What preparation did she do for the meeting? • What was the quality of her analysis, and what do you infer about the quality of the follow-up action taken with her students?

  41. OBSERVER #2: ANALYZE PRINCIPAL’S ROLE • GUIDING QUESTIONS—PRINCIPAL ANALYSIS GROUP: • What does the principal say? • How does he respond to teacher comments (and to what does he choose NOT to respond)? • What tips could you take away about leading analysis meetings?

  42. OBSERVER #3: ANALYZE SYSTEMS & PREPARATION • GUIDING QUESTIONS FOR SYSTEMS ANALYSIS GROUP: • What data-driven systems are in place at the school that are apparent in this meeting? • What preparation did the principal do to make this meeting effective? • Are there additional systems or preparation that could make the meeting more effective?

  43. Addressing Resistance in Analysis Meetings: Role Plays

  44. PRECURSORS TO EFFECTIVE ANALYSIS MTGS: BEFORE GIVING INTERIM ASSESSMENT • 6 WEEKS PRIOR TO INTERIM ASSESSMENTS: Teachers review the assessment and plan toward the rigor of those assessments (TRANSPARENCY) • A FEW WEEKS PRIOR: Teachers predict performance on each assessment question: a) confident they’ll get it right; b) not sure; c) no way they’ll get it right (TEST-IN-HAND, TEACHER-OWNED) • PD (timing flexible): Teachers receive a model of how to do assessment analysis and complete an action plan, and they see models of effective & ineffective analysis meetings (PROF DEVT, DEEP)

  45. PRECURSORS TO EFFECTIVE ANALYSIS MTGS: AFTER INTERIM ASSESSMENT GIVEN • TEACHER ANALYSIS: Teachers do analysis of results prior to the meeting, answering the fundamental question: WHY did the students not learn it? (TEACHER-OWNED, DEEP) • TEACHER ACTION PLAN: Teachers complete an action plan (ACTION PLAN, ACCOUNTABILITY) • LEADERSHIP ANALYSIS: Leader analyzes teacher results in preparation for the meeting (LEADERSHIP) • REVIEW OF TEACHER PLAN: Leader collects the teacher’s action plan/analysis ahead of time to see if it looks acceptable (LEADERSHIP, ACCOUNTABILITY) • CONTENT EXPERTISE: Leader has a plan ready to access content experts if the problems are beyond their own expertise (PROF DEVT)

  46. TIPS FOR EFFECTIVE ANALYSIS MEETINGS: • Let the data do the talking • Let the teacher do the talking (or get them to!) • Always go back to the test and back to specific questions • Don’t fight the battles on ideological lines (you’re going to lose) • There’s a difference between the first assessment and the third • You’ve got to know the data yourself to have an effective meeting • Make sure it’s connected to a concrete plan that you can verify

  47. ANALYSIS MEETING HELPFUL PHRASES: • HELPFUL STARTERS FOR ANALYSIS MEETINGS: • “So…what’s the data telling you?” • “Congratulations on the improvement from last time in x area! You must be really proud of their growth here.” • “So the _____ [paraphrase their frustration: the test was hard, the students were difficult, etc.]? I’m sorry to hear that. So where should we begin with our action plan moving forward?”

  48. ANALYSIS MEETING HELPFUL PHRASES: • DATA-FOCUSING FOR ANALYSIS MEETINGS: • “So let’s look at question 18… Why do you think they got it wrong?” • “You know, I thought it might be a silly mistake, but what surprised me is that they did really well on questions x & y. Why do you think they did so well on these questions and yet not on your original question?” • “Let’s look at question 11. What did the students need to be able to do to answer that question effectively? Is this more than they are able to do with you in your class?” • [When new ideas occur or deeper analysis is done at the meeting than what the teacher did previously] “So let’s revisit the action plan you created and see how we can incorporate these additional ideas.”
