Design-Based Research for Advancing Educational Technology


Presentation Transcript


  1. Design-Based Research for Advancing Educational Technology EDIT 9990

  2. Goals • Critique the state-of-the-art of educational research. • Describe applications of “design-based research.” • Encourage new thinking about why and how we do research.

  3. Improving the quality of teaching and learning through educational research is critical to our survival.

  4. Global Warming

  5. Dutch Floating Homes

  6. The lack of scientific literacy is appalling in even the most developed countries.

  7. Bad News: Oh no! Most educational research has little impact on practitioners and yields few discernible benefits.

  8. The Failure of Educational Research • Vast resources going into education research are wasted. • They [educational researchers] employ weak research methods, write turgid prose, and issue contradictory findings.

  9. The Failure of Educational Research • Too much useless work is done under the banner of qualitative research. • Qualitative research…. [yields] ….little that can be generalized beyond the classrooms in which it is conducted.

  10. College of Education, The University of Georgia • Ranked 27th of 187 education colleges in the USA • 240 faculty members in 9 departments • 5,000 students in a 33,000+ student university

  11. Research Productivity 1997-2001: Refereed Journal Articles (in-cites.com) • U. of Wisconsin - 202 • U. of Georgia - 201 • U. of Michigan - 164 • Indiana U. - 161 • U. of Maryland - 146

  12. Georgia vs. Wisconsin • Per pupil: $7,824 (GA) vs. $8,604 (WI) • Salary: $44,073 (GA) vs. $42,232 (WI) • HS Grad.: 51% (GA) vs. 78% (WI) • Ranking: 49th (GA) vs. 7th (WI)

  13. It is time we put the PUBLIC back in publication!

  14. Bush Administration Position • “There’s been no improvement in education over the last 30 years, despite a 90 percent increase in real public spending per pupil.” • Promotes randomized controlled trials as used in medical research.

  15. Four Reform Principles • Accountability: Guaranteeing Results • Flexibility: Local Control for Local Challenges • Research-Based Reforms: Proven Methods with Proven Results • Parental Options: Choices for Parents, Hope for Kids

  16. What Works Clearinghouse The What Works Clearinghouse (WWC) has been established by the U.S. Department of Education’s Institute of Education Sciences to provide educators, policymakers, researchers, and the public with a central and trusted source of scientific evidence of what works in education.

  17. Slavin’s 5 Questions for Valid Educational Research • Is there a control group? • Are the control and experimental groups assigned randomly? • If a matched study, are the groups extremely similar? • Is the sample size large enough? • Are the results statistically significant? Robert Slavin
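
To make the last three of Slavin's questions concrete, here is a minimal sketch of random assignment to control and experimental groups followed by a two-sample significance test. The group sizes, score distributions, and use of SciPy are illustrative assumptions, not anything prescribed in the slide.

```python
# Minimal sketch of Slavin-style design checks on invented data:
# random assignment (question 2) and a significance test (question 5).
import random
import statistics
from scipy import stats  # SciPy's two-sample t-test

random.seed(42)

# Hypothetical pool of 60 students, assigned to groups at random.
students = list(range(60))
random.shuffle(students)
control_ids, treatment_ids = students[:30], students[30:]

# Hypothetical post-test scores for each group (question 1: a control group exists).
control_scores = [random.gauss(70, 10) for _ in control_ids]
treatment_scores = [random.gauss(74, 10) for _ in treatment_ids]

# Question 5: are the results statistically significant?
result = stats.ttest_ind(treatment_scores, control_scores)
print(f"control mean   = {statistics.mean(control_scores):.1f}")
print(f"treatment mean = {statistics.mean(treatment_scores):.1f}")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```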

  18. “What Works” Position • “Once we have dozens or hundreds of randomized or carefully matched experiments going on each year on all aspects of educational practice, we will begin to make steady, irreversible progress.” • NCLB funds “scientifically based research.” Robert Slavin

  19. “It Won’t Work” Position • Double blind experiments impossible in education • Implementation variance reduces treatment differences • Causal agents are under-specified in education • Goals, beliefs, and intentions of students and teacher affect treatments David R. Olson

  20. Medical and health knowledge is rarely applied sufficiently.

  21. Another “It Won’t Work” Position • The What Works Clearinghouse (WWC) standards “ignore the critical realities about social, organizational, and policy environments in which educational programs and interventions reside.” • Advocates “decision-oriented” evaluation research over “conclusion-oriented” academic research. • Recommends extended-term mixed-method (ETMM) designs as a viable alternative. Madhabi Chatterji

  22. American Evaluation Association The priority given to randomized controlled trials “manifests fundamental misunderstandings about 1) the types of studies capable of determining causality, 2) the methods capable of achieving scientific rigor, and 3) the types of studies that support policy and program decisions. We would like to help avoid the political, ethical, and financial disaster that could well attend implementation of the proposed priority.”

  23. “Randomized controlled trials are the only way we’ll ever be able to prove ‘what works’ in education!” versus “Randomized controlled trials promote pseudoscience and will limit effective change!”

  24. Educational researchers have failed to make a clear appeal to the public for their support. People learn when…..

  25. Ellen Lagemann argues that educational researchers, in a misguided effort to be “scientific,” have turned away from the pragmatic vision of John Dewey. • She attacks the excessive emphasis on quantitative measurement.

  26. Kieran Egan argues that progressive ideas from Herbert Spencer, John Dewey, and Jean Piaget are responsible for the “general ineffectiveness” of our schools. • He also assails the notion that education can be improved through research as traditionally conceived.

  27. Thomas Kuhn The Structure of Scientific Revolutions "I'm not sure that there can now be such a thing as really productive educational research. It is not clear that one yet has the conceptual research categories, research tools, and properly selected problems that will lead to increased understanding of the educational process. There is a general assumption that if you've got a big problem, the way to solve it is by the application of science. All you have to do is call on the right people and put enough money in and in a matter of a few years, you will have it. But it doesn't work that way, and it never will."

  28. Complexity of Interactions • We cannot store up generalizations and constructs for ultimate assembly into a network. • When we give proper weight to local conditions, any generalization is a working hypothesis, not a conclusion. Lee Cronbach

  29. Learning Styles “Research into learning styles can, in the main, be characterised as small-scale, non-cumulative, uncritical and inward-looking. It has been carried out largely by cognitive and educational psychologists, and by researchers in business schools and has not benefited from much interdisciplinary research.”

  30. Dichotomies: convergers versus divergers; verbalisers versus imagers; holists versus serialists; deep versus surface learning; activists versus reflectors; pragmatists versus theorists; adaptors versus innovators; assimilators versus explorers; field dependent versus field independent; globalists versus analysts; assimilators versus accommodators; imaginative versus analytic learners; non-committers versus plungers; common-sense versus dynamic learners; concrete versus abstract learners; random versus sequential learners; initiators versus reasoners; intuitionists versus analysts; extroverts versus introverts; sensing versus intuition; thinking versus feeling; judging versus perceiving; left brainers versus right brainers; meaning-directed versus undirected; theorists versus humanitarians; activists versus theorists; pragmatists versus reflectors; organisers versus innovators; lefts/analytics/inductives/successive processors versus rights/globals/deductives/simultaneous processors; executive, hierarchic, conservative versus legislative, anarchic, liberal.

  31. If Sisyphus were a scholar, his field would be educational research. - David Labaree

  32. Educational Technology Research

  33. Pseudoscience Results [image slide: cone of experience example]


  35. Educational technology researchers are not doing much better than other educational researchers.

  36. NCLB Requirements • "every student is technologically literate by the time the student finishes the eighth grade," and • "that technology will be fully integrated into the curricula and instruction of the schools by December 31, 2006."

  37. Abundant technology has not led to extensive use of computers for “tradition-altering classroom instruction.” • The small percentage of instructors who do use computers use them mainly to maintain existing classroom practices.

  38. Teachers have legitimate concerns. • Is it simple enough for me to learn quickly? • Is it versatile? • Will it motivate students? • Is it aligned with the skills I’m expected to teach? • Is it reliable? • If it breaks, who will help? • Will it weaken my classroom authority?

  39. Ed. Tech Research Reality • Isolated researchers conduct individual studies rarely linked to a research agenda or concerned with any relationship to practice. • Studies are presented at conferences attended by other researchers and published in journals few people read. • Occasional literature reviews and meta-analyses are published.

  40. Ed. Tech Research Reality • Many educational technology studies claim to have predictive goals (testing theories) and use quasi-experimental designs with quantitative measures. • Research reviewers usually must reject 75 percent or more of the published studies to find the few worthy of further review or inclusion in meta-analyses.

  41. Ed. Tech Research Reality • Dillon & Gabbard’s 1998 literature review of “Hypermedia as an Educational Technology” highlights problems with IT research. • Major conclusion: “Clearly, the benefits gained from the use of hypermedia technology in learning scenarios appear to be very limited and not in keeping with the generally euphoric reaction to this technology in the professional arena.”

  42. Ed. Tech Research Reality • Fabos & Young’s 1999 literature review of “Telecommunications in the Classroom: Rhetoric Versus Reality” is another bad sign. • Major conclusion: “…many of the expected benefits of telecommunications [enhancing writing, multicultural awareness, and economic possibilities] are inconclusive, optimistic, and even contradictory.”

  43. Bernard et al. (2004) Meta-analysis: “How Does Distance Education Compare to Classroom Instruction?” • a very small but positive mean effect size for interactive distance education over traditional classroom instruction on student achievement • small negative effect for retention rate

  44. DE Research from 1985-2002 • 1,010 potential “studies” retrieved • 232 studies met all criteria • 599 independent effect sizes • 47,341 students (achievement)

  45. Results: Overall Effects • 325 independent outcomes (total achievement) • Hedges’ g = +0.0122, p < .001 • Range of findings from –2.17 to +2.66 • 177 outcomes with low methodology removed • Hedges’ g = +0.017, p > .05 • Significantly heterogeneous

  46. Distribution of Effect Sizes [chart: effect sizes ordered by magnitude, vertical axis Hedges’ g] • 325 independent outcomes (achievement) • Hedges’ g = +0.0122, p < .001
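
For readers unfamiliar with the statistic quoted above, Hedges’ g is a standardized mean difference (treatment mean minus control mean, divided by the pooled standard deviation) with a small-sample bias correction. The sketch below computes g for a single hypothetical comparison; the means, standard deviations, and group sizes are invented for illustration and are not Bernard et al.’s data.

```python
# Minimal sketch of computing Hedges' g for one study:
# a bias-corrected standardized mean difference between two groups.
# All numbers below are invented for illustration only.
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Hedges' g = (treatment mean - control mean) / pooled SD, multiplied
    by the small-sample correction J = 1 - 3 / (4 * (n_t + n_c) - 9)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd           # Cohen's d
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)  # small-sample bias correction
    return d * correction

# Hypothetical distance-education vs. classroom comparison.
print(round(hedges_g(mean_t=74.0, sd_t=10.0, n_t=40,
                     mean_c=72.5, sd_c=11.0, n_c=38), 3))
```

In a meta-analysis such as Bernard et al. (2004), each of the 599 independent effect sizes mentioned earlier would be a g value computed roughly this way, which the reviewers then combine into the weighted mean effects reported above.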

  47. Sir John Daniel - UNESCO … the futile tradition of comparing test performances of students using new learning technologies with those who study in more conventional ways…is a pointless endeavor because any teaching and learning system, old or new, is a complex reality. Comparing the impact of changes to small parts of the system is unlikely to reveal much effect and indeed, “no significant difference” is the usual result of such research.
