
Measures of Effective Teaching (MET) project


Presentation Transcript


  1. June 20, 2011 Measures of Effective Teaching (MET) project

  2. What does it take to build better feedback & evaluation systems? (and why start there, anyway?)

  3. teaching effectiveness: close the gap

  4. “we can’t fire our way to Finland” - Linda Darling-Hammond

  5. “but can we get to Finland through feedback?”

  6. information you can trust

  7. Teacher Evaluation Scores – Reading

  8. Trustworthiness Tests
     1. Face Validity: Do teachers recognize the observation instrument and other measures as reflecting qualities of practice they value?
     2. Coherence: Do the measures match your district’s theory of instruction?
     3. Predictive Validity: Do scores on your measures correlate with outcomes that you value, such as gains in student learning?
     4. Scoring Reliability: If a different rater had been assigned to the observation or the assessment, would the score be the same?
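A rough illustration of the two quantitative checks, using invented numbers rather than MET data or MET analysis code: predictive validity as a correlation between observation scores and student-learning gains, and scoring reliability as the exact-agreement rate between two raters who scored the same lessons.

```python
# Illustrative sketch only: invented scores, not MET data or MET analysis code.
from statistics import correlation, mean  # Python 3.10+

# 3. Predictive validity: do observation scores track gains in student learning?
observation_scores = [2.1, 2.8, 3.4, 2.5, 3.9, 3.1]        # rubric score per teacher
learning_gains     = [0.02, 0.10, 0.18, 0.05, 0.25, 0.12]  # student-learning gain per teacher
print("predictive validity (Pearson r):",
      round(correlation(observation_scores, learning_gains), 2))

# 4. Scoring reliability: would a different rater have given the same score?
rater_a = [3, 2, 4, 3, 2, 4]
rater_b = [3, 3, 4, 2, 2, 4]
exact_agreement = mean(1 if a == b else 0 for a, b in zip(rater_a, rater_b))
print("exact rater agreement:", round(exact_agreement, 2))
```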

  9. our partners

  10. The Measures of Effective Teaching Project: Participating Teachers
     • Two school years: 2009–10 and 2010–11
     • >100,000 students
     • Grades 4–8: ELA and Math
     • High School: ELA I, Algebra I, and Biology

  11. Research Partners
     Our primary collaborators include:
     • Mark Atkinson, Teachscape
     • Nancy Caldwell, Westat
     • Ron Ferguson, Harvard University
     • Drew Gitomer, Educational Testing Service
     • Eric Hirsch, New Teacher Center
     • Dan McCaffrey, RAND
     • Roy Pea, Stanford University
     • Geoffrey Phelps, Educational Testing Service
     • Rob Ramsdell, Cambridge Education
     • Doug Staiger, Dartmouth College
     Other key contributors include:
     • Joan Auchter, National Board for Professional Teaching Standards
     • Charlotte Danielson, The Danielson Group
     • Pam Grossman, Stanford University
     • Bridget Hamre, University of Virginia
     • Heather Hill, Harvard University
     • Sabrina Laine, American Institutes for Research
     • Catherine McClellan, Educational Testing Service
     • Denis Newman, Empirical Education
     • Raymond Pecheone, Stanford University
     • Robert Pianta, University of Virginia
     • Morgan Polikoff, University of Southern California
     • Steve Raudenbush, University of Chicago
     • John Winn, National Math and Science Initiative
     The MET Project: The Bill & Melinda Gates Foundation launched the Measures of Effective Teaching (MET) project in fall 2009 to test new approaches to measuring effective teaching. The project’s goal is to help build fair and reliable systems for teacher observation and feedback, so that teachers can improve and administrators can make better personnel decisions. With funding from the foundation, data collection and analysis are led by researchers from academic institutions, nonprofit organizations, and several private firms, and are carried out in seven urban school districts.

  12. our measures

  13. Multiple Measures of Teaching Effectiveness

  14. The MET Project research: How well can…
     • Student Perception Surveys
     • + Structured Classroom Observations
     • + Pedagogical Content Knowledge Test
     • + Student Outcomes
     …produce valid & reliable measures of teaching?

  15. We asked students.

  16. Classroom Observation Using Digital Video

  17. Knowledge for Teaching

  18. Value-added on each test
     • Value-added on state assessment
     • Value-added on supplemental assessments
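The core idea behind a value-added score, sketched with invented numbers (the MET analysis uses richer statistical models with additional controls): predict each student’s end-of-year score from the prior-year score, then average how far each teacher’s students land above or below that prediction.

```python
# Illustrative sketch only: a bare-bones value-added calculation on invented data.
from statistics import linear_regression, mean  # Python 3.10+
from collections import defaultdict

# (teacher, prior-year score, end-of-year score) for a handful of students
students = [
    ("T1", 48, 55), ("T1", 60, 64), ("T1", 72, 80),
    ("T2", 50, 51), ("T2", 65, 63), ("T2", 70, 74),
]

prior = [p for _, p, _ in students]
post = [y for _, _, y in students]
slope, intercept = linear_regression(prior, post)  # expected end-of-year score given prior score

# A teacher's value-added is how much better (or worse) than predicted
# their students scored, on average.
residuals = defaultdict(list)
for teacher, p, y in students:
    residuals[teacher].append(y - (intercept + slope * p))

for teacher, gaps in residuals.items():
    print(teacher, "value-added estimate:", round(mean(gaps), 2))
```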

  19. helping states and districts

  20. Validation Engine
     • System picks observation rubric & trains raters
     • Raters score MET videos of instruction
     • Software provides analysis of: rater consistency; the rubric’s relation to student learning
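One plausible report such an engine could produce (an assumed workflow, not the actual software): how closely each trainee rater’s scores on pre-scored MET calibration videos match the master scores, both exactly and within one point. Video IDs, scores, and the tolerance rule below are illustrative assumptions.

```python
# Assumed workflow sketch, not the validation engine's real code or data.
master_scores = {"video_01": 3, "video_02": 2, "video_03": 4, "video_04": 3}

trainee_scores = {
    "rater_A": {"video_01": 3, "video_02": 2, "video_03": 3, "video_04": 3},
    "rater_B": {"video_01": 4, "video_02": 1, "video_03": 4, "video_04": 2},
}

def agreement(scores: dict, master: dict, tolerance: int = 0) -> float:
    """Share of videos scored within `tolerance` points of the master score."""
    hits = sum(abs(scores[v] - master[v]) <= tolerance for v in master)
    return hits / len(master)

for rater, scores in trainee_scores.items():
    print(rater,
          "exact agreement:", agreement(scores, master_scores),
          "within one point:", agreement(scores, master_scores, tolerance=1))
```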

  21. Ensuring Reliable Observations – Rater Training & Consistency: building trust

  22. Facilitating Feedback
     • Direct feedback
     • Real-time (ear-bud) coaching
     • Remote collaboration

  23. Preliminary Finding #1: the kids know

  24. Student Perceptions
     Care
     • My teacher makes me feel that s/he really cares about me
     • My teacher seems to know if something is bothering me
     • My teacher really tries to understand how students feel about things
     Consolidate
     • My teacher takes the time to summarize what we learn each day
     • The comments that I get on my work in this class help me understand how to improve
     Clarify
     • If you don’t understand something, my teacher explains it a different way
     • My teacher knows when the class understands, and when we do not
     • My teacher has several good ways to explain each topic that we cover in the class
     Challenge
     • My teacher asks students to explain more about the answers they give
     • My teacher doesn’t let people give up when the work gets hard
     • In this class, we learn to correct our mistakes
     Control
     • Students in this class treat the teacher with respect
     • My classmates behave the way the teacher wants them to
     • Our class stays busy and doesn’t waste time
     Captivate
     • My teacher makes learning enjoyable
     • My teacher makes learning interesting
     • I like the way we learn in this class
     Confer
     • My teacher wants us to share our thoughts
     • Students get to decide how activities are done in this class
     Test Prep
     • I have learned a lot this year about [the state test]
     • Getting ready for [the state] test takes a lot of time in our class

  25. Students Distinguish Between Teachers – Percentage of Students by Classroom Agreeing

  26. Student Perceptions: Top 5 Correlations
     Rank | Survey Statement | Category
     1 | Students in this class treat the teacher with respect | Control
     2 | My classmates behave the way my teacher wants them to | Control
     3 | Our class stays busy and doesn’t waste time | Control
     4 | In this class, we learn a lot every day | Challenge
     5 | In this class, we learn to correct our mistakes | Challenge
     33 | I have learned a lot this year about [the state test] | Test Prep
     34 | Getting ready for [the state test] takes a lot of time in our class | Test Prep

  27. Preliminary Finding #2: the consequences are great

  28. Teacher Impact: Real or Random?

  29. Students with Most Effective Teachers Learn More in School

  30. still to come…

  31. Video example – coming tomorrow!

  32. Video validation
     Developer | Instrument
     University of Virginia | Classroom Assessment Scoring System (CLASS)
     University of Michigan | Mathematical Quality of Instruction (MQI)
     Charlotte Danielson | Framework for Teaching
     Stanford University | Quality Science Teaching (QST)
     Pam Grossman | Protocol for Language Arts Teaching Observation (PLATO)
     NBPTS | National Board for Professional Teaching Standards
     Natl Math & Sci Initiative | UTeach Observation Protocol (UTOP)

  33. MET Logical Sequence
     Research:
     • Measures reliable
     • Measures predict
     • Measures combine?
     • Measures stable under pressure
     • Measures fairly reflect teacher
     • Measures improve effectiveness
     Use:
     • Effective Teaching Index
     • Teaching Effectiveness Dashboard
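If the measures do combine, an Effective Teaching Index could in principle be a weighted average of standardized component scores. The components, weights, and numbers below are illustrative assumptions, not the MET project’s published composite.

```python
# Illustrative composite-index sketch with assumed components, weights, and scores.
from statistics import mean, pstdev

teachers = {
    "T1": {"observation": 3.1, "student_survey": 4.0, "value_added": 0.10},
    "T2": {"observation": 2.6, "student_survey": 3.2, "value_added": -0.05},
    "T3": {"observation": 3.8, "student_survey": 4.4, "value_added": 0.22},
}
weights = {"observation": 0.4, "student_survey": 0.3, "value_added": 0.3}

def z_scores(measure):
    """Standardize one measure across teachers so the components are comparable."""
    values = [s[measure] for s in teachers.values()]
    mu, sigma = mean(values), pstdev(values)
    return {t: (s[measure] - mu) / sigma for t, s in teachers.items()}

standardized = {m: z_scores(m) for m in weights}
index = {t: sum(w * standardized[m][t] for m, w in weights.items()) for t in teachers}

for teacher, value in sorted(index.items(), key=lambda kv: -kv[1]):
    print(teacher, "effective-teaching index:", round(value, 2))
```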

  34. www.metproject.org info@metproject.org
