Presentation Transcript

  1. Using Observation to Analyze Effectiveness Marybeth Flachbart, Ed.D. mflachbart@neuhaus.org

  2. www.neuhaus.org

  3. Objectives Review research Clarify types of walk-throughs Identify areas of need through observational data

  4. Resources Used: Leading for Instructional Improvement, Fink & Markholt, 2011; Visible Learning, Hattie, 2009; Visible Learning for Teachers, Hattie, 2012

  5. Leading for Instructional Improvement, Stephen Fink & Anneke Markholt

  6. Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. John Hattie, Professor of Education and Director of the Visible Learning Labs, University of Auckland

  7. Effect Size Moves Us Beyond “Did it work?” • Effect size is a way of quantifying the size of the difference between two groups. • It answers “How well does it work?” • Generally used in meta-analysis for combining and comparing estimates from different studies.
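The "standardized mean difference" idea behind effect size can be sketched with Cohen's d: the gap between two group means divided by their pooled standard deviation. The test scores below are made-up illustration data, not figures from the presentation:

```python
import statistics

def cohens_d(treatment, control):
    """Effect size as a standardized mean difference:
    d = (mean_treatment - mean_control) / pooled standard deviation."""
    n_t, n_c = len(treatment), len(control)
    var_t = statistics.variance(treatment)  # sample variance of each group
    var_c = statistics.variance(control)
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical post-test scores for two classrooms
treatment = [78, 85, 90, 82, 88, 84]
control = [72, 75, 80, 70, 78, 76]
d = cohens_d(treatment, control)
```

Because d is expressed in standard-deviation units rather than raw points, meta-analyses can combine and compare results from studies that used different tests and scales.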

  8. Basis of Hattie’s Conclusion: 800+ meta-analyses, 50,000+ studies, and 200+ million students

  9. Hattie’s Range of Effect Size .07

  10. Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. John Hattie, Professor of Education and Director of the Visible Learning Labs, University of Auckland

  11. Raise your hand if you’ve read any or all of these books

  12. If you have read any of the books: Which one(s)? What did you think of it? Please provide answers in the comment box

  13. Dispelling the Widget Myth

  14. The Widget Effect, The New Teacher Project, 2009 • Surveyed 14 large American school districts • 98% of teachers were evaluated as “satisfactory” • “Based on such findings, many have characterized classroom observation as a hopelessly flawed approach to assessing teacher effectiveness.” http://widgeteffect.org/

  15. Can classroom observations identify practices that raise achievement? http://educationnext.org/evaluating-teacher-effectiveness/ YES!

  16. Defining Classroom Management Room setup Access to materials Transition time Procedures Grouping Climate

  17. Effect Size: Classroom behavioral .80; Classroom management .52

  18. Defining Instructional Practice Link to standards Use of questioning Teacher led vs. student led Cognitive demand of selected activities Differentiation Assessment practices BICS vs. CALP

  19. Effect Size: Formative assessment .90; Micro-teaching .88; Teacher clarity .75; Reciprocal teaching .74; Meta-cognitive strategies .69; Cooperative learning .59; Questioning .64
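A minimal sketch of how such numbers are commonly read against Hattie's 0.40 "hinge point" (the average effect he reports across interventions, roughly a year's typical progress). The dictionary simply restates the slide's figures:

```python
# Effect sizes as listed on the slide (Hattie, Visible Learning)
effects = {
    "Formative assessment": 0.90,
    "Micro-teaching": 0.88,
    "Teacher clarity": 0.75,
    "Reciprocal teaching": 0.74,
    "Meta-cognitive strategies": 0.69,
    "Questioning": 0.64,
    "Cooperative learning": 0.59,
}

HINGE = 0.40  # Hattie's hinge point: effects above this exceed typical annual growth

# Rank the practices that clear the hinge point, largest effect first
above_hinge = sorted(
    (name for name, d in effects.items() if d > HINGE),
    key=lambda name: -effects[name],
)
```

All seven practices on the slide clear the hinge point, which is why they are worth looking for during walk-throughs.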

  20. Which is easier to observe, classroom management or instructional practices? Respond in comment box

  21. Gathering Data (Fink, Markholt) Types of walk-throughs Learning walk Goal setting Implementation Supervisory Other

  22. Your Practice Which types of walk-through do you do most frequently? Why? Learning walk Goal setting Implementation Supervisory Other

  23. Observational data is not evaluative! “Stick to the facts, ma’am, just the facts.”

  24. A Profession in Search of a Practice City and Elmore 2009

  25. Developing the discipline of seeing Seeing is a discipline It is like a muscle – it gets stronger with repetition Foundation of Instructional Rounds

  26. Observing Classroom Practice Using the Observation Protocol collect data that is: • Descriptive not evaluative- just the facts! • Specific • About instructional core • Related to problem of practice (or data needed to be collected)

  27. Specificity of Evidence “Teacher introduced the concept of fractions and students applied it in a hands-on activity.” vs. “Prompt for student essays: “What role did symbolism play in foreshadowing the main character’s dilemma?”

  28. Description vs. Judgment: With / Without • “Student 1 asked student 2: ‘What are we supposed to write down?’ Student 2 said, ‘I don’t know.’” • “The teacher read from a book that was not at the appropriate level for the class.” • “Teacher introduced the concept of fractions and students applied it in a hands-on activity.”

  29. Description vs. Judgment “Students followed directions in the text to make circuit boards.” “There was too much time on discussion, not enough time on individual work.” “The students conducted a sophisticated lab experiment.”

  30. When observing, what do you see? • What is the teacher doing and saying? • What are the students doing and saying? • What is the task?

  31. Learning Walk Possible data points: Percentage of teacher vs. student talk Formative assessment practices Standard clear to students Meta-cognitive strategies Number of questions (types of questions) Who is doing the questioning?

  32. Learning Walk continued Amount of time spent reading/writing/researching Cognitive demand of student work Academic Vocabulary use Level of engagement What else have you learned by walking around?

  33. Teaching Channel Videos: https://www.teachingchannel.org

  34. Practice: Observing: What do you see? What is the teacher doing? What are the students doing?

  35. What do you wonder? What questions would you have for the teacher? What questions would you have for the students?

  36. Goal setting walk through

  37. Gather Data before you set the goal!

  38. Goal: Improve writing achievement by 10% Define the problem of practice: Is it a lack of instruction in writing? Is the lack of instruction in mechanics, process, or both? Is it a lack of practice opportunities to write across the content areas? Is it a lack of vocabulary? Are the vocabulary deficits in oral or written language?

  39. Suppose your students’ writing skills were weak, and you then observed instruction… “Based on my observation of 15 classrooms (visited once a week for three weeks), we need to improve the quality of interaction between faculty and students. We need to model and require academic language in both our oral and written communication.”

  40. “If we model and then scaffold our students’ ability to use academic language, we will see an improvement in our students’ reading, writing, and speaking proficiency.”

  41. Theory of Action

  42. Implementation Walk-Through

  43. Implementation Walk When do you want to see this change in practice? How frequently will you gather implementation data? Who will gather the data? How will that refine your theory of action? Grade level, whole school, individual teacher?

  44. What if you see very little Implementation?

  45. Thou shalt… Only schedule professional development for faculty that you can measure in terms of a change in instructional practice Prepare a standard protocol for faculty and share it before observation Give sufficient time to implement and resources for job-embedded coaching

  46. Be specific!

  47. Clarify: Message from observer: Last Friday we had an in-service day where faculty shared their strategies for teaching writing across content areas. I am eager to see how you’re incorporating these practices in your classroom. I will be doing walk-throughs beginning next week to see how the students are responding to their new learning. Attached is the form I’ll be using to gather data so that we can discuss student work at our PLCs in November. I look forward to seeing you in action.

  48. Inspect what you expect Ask professional development providers (including staff) for an observation protocol “If you can’t see it in the classroom, it doesn’t exist.” Elmore