
Evidence-Based Education (EBE). Grover J. (Russ) Whitehurst, Assistant Secretary, Educational Research and Improvement. From the Dept. of Ed website: http://www.ed.gov/offices/OERI/presentations/evidencebase.ppt (277504 bytes, Fri Mar 15 12:15:53 EST 2002).


Presentation Transcript


  1. From Dept. of Ed website Dated http://www.ed.gov/offices/OERI/presentations/evidencebase.ppt - 277504 bytes - Fri Mar 15 12:15:53 EST 2002

  2. Evidence-Based Education (EBE) Grover J. (Russ) Whitehurst Assistant Secretary Educational Research and Improvement United States Department of Education

  3. What is EBE? • The integration of professional wisdom with the best available empirical evidence in making decisions about how to deliver instruction

  4. What is professional wisdom? • The judgment that individuals acquire through experience • Increased professional wisdom is reflected in numerous ways, including the effective identification and incorporation of local circumstances into instruction

  5. What is empirical evidence? • Scientifically based research from fields such as psychology, sociology, economics, and neuroscience, and especially from research in educational settings • Objective measures of performance used to compare, evaluate, and monitor progress

  6. Why are both needed? • Without professional wisdom, education cannot adapt to local circumstances or operate intelligently in the many areas in which research evidence is absent or incomplete • Without empirical evidence, education cannot resolve competing approaches, generate cumulative knowledge, or avoid fad, fancy, and personal bias

  7. Scientifically Based Research • Quality • Relevance

  8. Quality: Levels of evidence All evidence is NOT created equal • Randomized trial • Quasi-experiment, including before & after • Correlational study with statistical controls • Correlational study w/o statistical controls • Case studies

  9. Randomized Trials: The gold standard • Claim about the effects of an educational intervention on outcomes • Two or more conditions that differ in levels of exposure to the educational intervention • Random assignment to conditions • Tests for differences in outcomes
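
To make the four elements above concrete, here is a minimal sketch in Python (the participants, outcome scale, and +3-point effect are hypothetical illustrations, not part of the original slides): participants are randomly assigned to two conditions that differ in exposure to an intervention, and a standard test checks for differences in outcomes.

    # Minimal randomized-trial sketch; all data are hypothetical.
    import random
    from scipy import stats  # assumes SciPy is available for the t-test

    random.seed(1)

    students = [f"s{i}" for i in range(120)]   # hypothetical participants
    random.shuffle(students)                   # random assignment to conditions
    treatment, control = students[:60], students[60:]

    # Simulated outcome: everyone draws from the same distribution, and the
    # treatment condition gets a small (hypothetical) +3-point benefit.
    outcome = {s: random.gauss(75, 10) + (3 if s in treatment else 0) for s in students}

    # Test for differences in outcomes between the two conditions.
    t_scores = [outcome[s] for s in treatment]
    c_scores = [outcome[s] for s in control]
    result = stats.ttest_ind(t_scores, c_scores)
    print(f"treatment mean {sum(t_scores)/len(t_scores):.1f}, "
          f"control mean {sum(c_scores)/len(c_scores):.1f}, p = {result.pvalue:.3f}")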

  10. Why is randomization critical? • Assures that the participants being compared have the same characteristics across the conditions • Rules of chance mean that the smart, motivated, experienced, etc. have the same probability of being in condition 1 as in condition 2 • Without randomization, differences between two conditions may result from pre-existing differences in the participants, e.g., more of the smart ones in condition 1

  11. Why is randomization critical? Without randomization, simple associations, such as that between internet use and science grades, have many different interpretations. [Chart: Average science scores by students' reports on use of the Internet at home]
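
A small simulation sketch (hypothetical numbers, not from the slides) illustrates the point of slides 10 and 11: random assignment balances pre-existing characteristics such as prior ability across conditions, while self-selection produces groups that already differ before any intervention, which is exactly why a raw association like internet use versus science scores is hard to interpret.

    # Simulation sketch: random assignment vs. self-selection.
    # All numbers are hypothetical illustrations.
    import random
    from statistics import mean

    random.seed(42)

    # Hypothetical participants, each with a pre-existing "prior ability" score.
    ability = [random.gauss(100, 15) for _ in range(1000)]

    # Random assignment: every participant has the same chance of either condition,
    # so the two groups end up with nearly identical average ability.
    random.shuffle(ability)
    cond1, cond2 = ability[:500], ability[500:]
    print(f"randomized:    {mean(cond1):5.1f} vs {mean(cond2):5.1f}")

    # Self-selection: higher-ability participants are more likely to opt in,
    # so the groups differ before the intervention even starts.
    opt_in, opt_out = [], []
    for a in ability:
        (opt_in if random.random() < (a - 70) / 60 else opt_out).append(a)
    print(f"self-selected: {mean(opt_in):5.1f} vs {mean(opt_out):5.1f}")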

  12. Relevance • Does the study involve a similar intervention and outcome to those of interest? • Were the participants and settings representative of those of interest? • Were enough participants involved to justify generalization?

  13. EBE: How to use existing science • Search the literature (ERIC, Campbell Collaboration, PsycINFO, etc.) • Screen the literature for relevance and quality • Search for pre-digested evidence: narrative reviews (ERIC digests) and systematic reviews (meta-analyses)

  14. Screening research -- Cautions • Unconditional conclusions • Conclusions involving hypotheticals • Conclusions that diverge from evidence • Strong calls to action • Mixtures of opinions with evidence • Low prestige publication outlet • Publication outlet with ideological agenda

  15. Evidence will not make the decision • Be skeptical • Consider other ways of achieving goal • Consider consequences and local circumstances • Consult with experts who understand evidence before making costly decisions (This is different from consulting authorities who may know the subject area but not rules of evidence)

  16. EBE: How to use objective measures • Find or develop local data: measure and measure frequently; stagger the introduction of an innovation across sites; compare between and within • Search state and national databases for benchmarks: www.edtrust.org, www.greatschools.net, www.ses.standardandpoors.com, www.just4kids.org
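
As a sketch of the "measure frequently, stagger the introduction, compare between and within" idea on slide 16 (the site names, scores, and adoption months below are hypothetical, not from the presentation): when sites adopt an innovation at different times, each site's pre-adoption months provide a within-site comparison, and the later-adopting site provides a between-site comparison for the same calendar months.

    # Staggered-rollout comparison sketch; all data are hypothetical.
    from statistics import mean

    # Hypothetical monthly reading scores for two sites over nine months.
    # Site A adopts the innovation in month 4, Site B in month 7.
    scores = {
        "Site A": [61, 62, 60, 68, 70, 71, 72, 73, 74],
        "Site B": [59, 60, 61, 60, 61, 60, 67, 69, 70],
    }
    adoption_month = {"Site A": 4, "Site B": 7}

    for site, series in scores.items():
        first_post = adoption_month[site] - 1          # index of first post-adoption month
        before, after = series[:first_post], series[first_post:]
        print(f"{site}: before {mean(before):.1f}, after {mean(after):.1f}, "
              f"change {mean(after) - mean(before):+.1f}")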

  17. Example of using local evidence • Benchmarking using the www.edtrust.org website to identify high flying schools in your state with similar demographics

  18. EBE -- Where are we?

  19. What ED will do • The What Works Clearinghouse: interventions linked to evidentiary support; systematic reviews; standards for and providers of evaluations • Preschool Curriculum Evaluation Research • Well designed, timely, and nonpartisan evaluations of ED’s own programs • Funding for evaluations of promising innovations in the field • Build capacity

  20. Goals • ED will provide the tools, information, research, and training to support the development of evidence-based education • The practice of evidence-based education will become routine • Education across the nation will be continuously improved • Wide variation in performance across schools and classrooms will be eliminated
