
Methods and Tools for Measuring Fidelity


Presentation Transcript


1. Methods and Tools for Measuring Fidelity
Greg Roberts, PhD
Vaughn Gross Center & National Center for Instruction
The University of Texas at Austin

2. Notes on Fidelity
• Fidelity is the match between intended and actual (a toy scoring sketch follows this slide)
• Considerations for conceptualizing fidelity:
  • Multilevel nature of many interventions
  • Person and group
  • Outcome and process
  • Level and intensity of measurement aligned with need and with the likely quality of data
  • Alignment with desired outcomes
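The slide defines fidelity conceptually rather than as a formula. As a loose illustration only, the sketch below scores adherence as the proportion of intended intervention components actually observed; the component names and the scoring rule are hypothetical, not part of the presentation.

```python
# Hypothetical illustration only: the presentation defines fidelity conceptually,
# not as a specific formula. Here, adherence is scored as the proportion of
# intended intervention components that were actually observed in a lesson.

INTENDED_COMPONENTS = {          # hypothetical component checklist
    "explicit_vocabulary_instruction",
    "small_group_practice",
    "progress_monitoring",
    "corrective_feedback",
}

def adherence_score(observed_components):
    """Return the share of intended components seen in a lesson (0.0 to 1.0)."""
    observed = set(observed_components) & INTENDED_COMPONENTS
    return len(observed) / len(INTENDED_COMPONENTS)

# Example: three of the four intended components were observed -> 0.75
print(adherence_score(["explicit_vocabulary_instruction",
                       "small_group_practice",
                       "progress_monitoring"]))
```

A real fidelity measure would likely also weight components and capture quality and intensity, not just presence, in line with the multilevel considerations above.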

3. More Notes on Fidelity
• Capacity for monitoring fidelity
• Burden of monitoring fidelity
• Tools for monitoring fidelity
• Labor and cost intensity…for what?
• Prospective versus retrospective purpose

4. Fidelity-related Tools
• Teacher self reports
• Teacher logs
• Observation protocols
• Little evidence supporting their equivalence (see, for example, Burstein, McDonnell, Van Winkle, Ormseth, Mirocha, & Guiton, 1995; Porter, Kirst, Osthoff, Smithson, & Schneider, 1993)

5. Teacher Self Reports
• Methods:
  • Survey
  • Focus group
  • Interview
• Relatively cost efficient
• May be externally valid

6. Teacher Self Reports
• Response bias:
  • Recall
  • Social desirability
• Limited range of sampled behaviors
• Suspect predictive validity

7. Teacher Self Reports
• Research on enacted curriculum
• Federal efforts to increase utility:
  • Item type
  • Item content
  • Response format
• Informal efforts

8. Teacher Logs
• Standardized materials
• Initial and ongoing training re: use
• Prospective analysis plan
• Loads of data:
  • Data entry and management
  • Analysis (a toy aggregation sketch follows this slide)
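To make the "loads of data" point concrete, here is a minimal, hypothetical sketch of the kind of data entry and analysis a daily teacher log implies; the record fields and values are illustrative and are not drawn from any actual log instrument.

```python
# Hypothetical sketch of the data-entry and analysis burden teacher logs create;
# the record fields and values are illustrative, not from a real study's logs.
from collections import defaultdict

daily_logs = [  # one record per teacher per day per focus area (illustrative)
    {"teacher": "T01", "date": "2004-10-04", "focus": "comprehension", "minutes": 25},
    {"teacher": "T01", "date": "2004-10-04", "focus": "phonics",       "minutes": 15},
    {"teacher": "T02", "date": "2004-10-04", "focus": "comprehension", "minutes": 40},
]

# Simple analysis: total instructional minutes by teacher and focus area.
totals = defaultdict(int)
for rec in daily_logs:
    totals[(rec["teacher"], rec["focus"])] += rec["minutes"]

for (teacher, focus), minutes in sorted(totals.items()):
    print(f"{teacher}  {focus:<15} {minutes} min")
```

Multiplied across teachers, days, and schools, even this simple structure drives the data-management and prospective-analysis planning the slide calls out.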

9. Teacher Logs
• Study of Instructional Improvement (Camburn & Burns, 2003)
  • Evidence on the validity of these logs in representing the enacted curriculum
• Literacy instruction (Rowan, 2004):
  • varies day-to-day
  • varies across teachers (greatly)
  • differs (predictably) across educational reform programs (e.g., Success For All, America's Choice)

10. Classroom Observations
• External to teacher:
  • Eliminates response bias
  • Greater breadth of sampled behavior
• Rater reliability (a kappa sketch follows this slide):
  • Training
  • Monitoring
  • Drift
• Less external validity
• High inference, low inference, descriptive
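The slide flags rater reliability (training, monitoring, drift) as a cost of observation. One standard way to quantify agreement between two observers coding the same segments is Cohen's kappa; the presentation does not prescribe a statistic, so the sketch below is illustrative, with made-up codes.

```python
# Illustrative only: Cohen's kappa as one common chance-corrected agreement
# statistic for two raters coding the same observation segments.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codes of the same segments."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical codes from two observers watching the same five segments.
a = ["phonics", "comprehension", "phonics", "writing", "phonics"]
b = ["phonics", "comprehension", "writing", "writing", "phonics"]
print(round(cohens_kappa(a, b), 2))  # agreement corrected for chance
```

Tracking a statistic like this over time is one way to monitor rater drift after initial training.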

11. Classroom Observations
• Center for the Improvement of Early Reading Achievement Observation System (Taylor et al., 2003)
• Focus on:
  • teacher pedagogy
  • student learning processes
  • implementation of evidence-based instructional practices

12. Classroom Observations
• Five-minute observation cycles
• The record uses narrative notes and specific codes (a hypothetical record sketch follows this slide) for:
  • who is teaching
  • grouping (e.g., whole class, small group, pairs, individual)
  • reading/language arts activities (e.g., reading, writing)
  • focus of the instruction (e.g., comprehension, phonics)
  • materials used (e.g., textbooks, video, computers, board/chart)
  • teacher interaction (e.g., telling, modeling, discussion, coaching/scaffolding)
  • expected student response (e.g., reading, reading turn-taking)
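As a way to picture what one coded five-minute cycle might look like in a dataset, here is a hypothetical record structure using the categories listed above; the field names and code values are assumptions, not the CIERA instrument's actual coding scheme.

```python
# Hypothetical record structure for one five-minute observation cycle, using the
# coding categories listed on the slide; field names and code values are
# illustrative, not the CIERA instrument's actual codes.
from dataclasses import dataclass
from typing import List

@dataclass
class ObservationCycle:
    who_is_teaching: str            # e.g., "classroom teacher", "aide"
    grouping: str                   # e.g., "whole class", "small group", "pairs"
    activity: str                   # e.g., "reading", "writing"
    instructional_focus: str        # e.g., "comprehension", "phonics"
    materials: List[str]            # e.g., ["textbook", "board/chart"]
    teacher_interaction: str        # e.g., "modeling", "coaching/scaffolding"
    expected_student_response: str  # e.g., "reading", "turn-taking"
    narrative: str = ""             # free-text notes for the cycle

cycle = ObservationCycle(
    who_is_teaching="classroom teacher",
    grouping="small group",
    activity="reading",
    instructional_focus="comprehension",
    materials=["textbook"],
    teacher_interaction="coaching/scaffolding",
    expected_student_response="turn-taking",
    narrative="Teacher prompts students to predict the next event.",
)
print(cycle.instructional_focus)
```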

13. Classroom Observations
• Instructional Content Emphasis (Edmonds & Briggs, 2004)
  • Substantive and applied research and evaluation
• Teach for Success Classroom Observation Protocol (WestEd)
  • More oriented toward formative purposes
• School Observation Measure (Center for Research in Educational Policy)
  • School-wide measure

  14. Questions?
