
Between a Rock and a Hard Place





Presentation Transcript


  1. Between a Rock and a Hard Place: Scientifically Based Evidence in Evaluation. Presentation to the Canadian Evaluation Society, Vancouver, B.C., June 3, 2003

  2. What Works Clearinghouse • A new U.S. federal initiative • To be an ongoing public resource that will assess and report scientific evidence on “What Works?” in education • Systematic review processes will be used to report on the quantity, quality, and relevance of evidence, and the magnitude of effects of specific educational interventions

  3. What is empirical evidence? • Scientifically based research from fields such as psychology, sociology, speech & hearing, economics, and neuroscience, and especially from research in educational settings • Empirical data on performance used to compare, evaluate, and monitor progress

  4. Evidence-Based Education

  5. EBE – Where Are We?

  6. Quality: Levels of Evidence All evidence is NOT equal for questions of effectiveness (what works) • Randomized trial (true experiment) • Comparison groups (quasi-experiment) • Pre-/Post- comparison • Correlational studies • Case studies • Anecdotes

  7. What might this mean for evaluators? • Evaluation questions reduced to “what works” questions • Evaluation costs redirected from complex studies of multiple variables to RCTs • Evaluators’ bookshelves filling up with research design texts and pushing evaluation texts aside • Logic models falling into disuse • Evaluations tapped for inclusion in reviews/syntheses often not yet complete • Client requests for RCT evaluations

  8. Randomized Trials: The gold standard • Claims about the effects of an educational intervention on outcomes • Two or more conditions that differ in levels of exposure to the educational intervention • Random assignment to conditions • Tests for differences in outcomes
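  To make the four elements of slide 8 concrete, the following is a minimal Python sketch of a two-condition randomized trial: a pool of participants, random assignment to conditions that differ in exposure to the intervention, and a test for differences in outcomes. Everything here is an illustrative assumption, not part of the presentation: the sample size, the group means, and the simulated scores are invented, and a real trial would analyze measured outcomes rather than generated ones.

    # Minimal sketch of a two-condition randomized trial (hypothetical data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=42)

    # Hypothetical pool of 200 students eligible for the study.
    student_ids = np.arange(200)

    # Random assignment to conditions: intervention vs. control.
    rng.shuffle(student_ids)
    intervention_ids, control_ids = student_ids[:100], student_ids[100:]

    # Simulated outcome scores; in a real trial these would be measured
    # after the groups received different levels of exposure to the
    # intervention. The group means (72 vs. 68) are invented.
    intervention_scores = rng.normal(loc=72.0, scale=10.0, size=len(intervention_ids))
    control_scores = rng.normal(loc=68.0, scale=10.0, size=len(control_ids))

    # Test for a difference in outcomes between the two conditions.
    t_stat, p_value = stats.ttest_ind(intervention_scores, control_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

  Because assignment is random, any systematic difference between the groups' outcomes can be attributed to the intervention rather than to pre-existing differences, which is what distinguishes this design from the quasi-experimental and pre-/post- comparisons lower on the evidence hierarchy in slide 6.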
