
How Do We Know It’s Working?


Presentation Transcript


  1. How Do We Know It’s Working? Creating Evaluations for Technology Projects and Evaluations (part I)

  2. Contact Information • jsun@sun-associates.com • 978-251-1600 ext. 204 • www.edtechevaluation.com • This presentation will be linked to that site (on the Tools page)

  3. Where Do We Stand? • Who’s working on an actual project? • Current? • Anticipated? • Your expectations for today

  4. Workshop Goals • To review the key elements of effective program evaluation as applied to technology evaluations • To consider evaluation in the context of your actual projects

  5. Why Evaluate? • To fulfill program requirements • NCLB and hence Title IID carry evaluation requirements • To realize your investment in technology • What sort of “difference” has all of this technology made?

  6. Basis in NCLB “The application shall include:… A description of the process and accountability measures that the applicant will use to evaluate the extent to which activities funded under this subpart are effective in integrating technology into curricula and instruction, increasing the ability of teachers to teach, and enabling students to meet challenging State academic content and student academic achievement standards.” NCLB Act, Title II, Part D, Section 2414(11)

  7. One consistent thread in NCLB is evaluation and assessment • How can you document that this “intervention” is making a difference? • All funded work must be based in reflection and data-driven decision-making • Naturally, this translates to local district proposals

  8. A Framework for Review

  9. Evaluation • Helps clarify project goals, processes, products • Must be tied to indicators of success written for your project’s goals • Not a “test” or checklist of completed activities • Qualitatively, are you achieving your goals? • What adjustments can be made to your project to realize greater success?

  10. The Basic Process • Evaluation Questions • Tied to original project goals • Performance Rubrics • Allow for authentic, qualitative, and holistic evaluation • Data Collection • Tied to indicators in the rubrics • Scoring and Reporting • Role of this committee (the evaluation committee)
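
To make that chain concrete, here is a minimal sketch in Python of how each evaluation question might be traced to its indicators and to the evidence to be collected. The question, indicator, and data-source text are invented for illustration; this is an assumed structure, not part of the original workshop materials.

```python
# Hypothetical traceability sketch: each evaluation question is tied to
# indicators, and each indicator names the data sources that would feed it.
# All question, indicator, and source text below is invented for illustration.
evaluation_plan = [
    {
        "question": "Has the project improved student mastery of inquiry skills?",
        "indicators": [
            {
                "indicator": "Students design and carry out their own investigations",
                "data_sources": ["classroom observation", "student work samples"],
            },
            {
                "indicator": "Teachers report increased use of inquiry-based lessons",
                "data_sources": ["teacher survey", "focus group"],
            },
        ],
    },
]

# A simple completeness check before data collection begins: every indicator
# should name at least one data source.
for item in evaluation_plan:
    for ind in item["indicators"]:
        if not ind["data_sources"]:
            print("Missing data sources for:", ind["indicator"])
        else:
            print(item["question"], "<-", ind["indicator"])
```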

  11. Who Evaluates? • Committee of stakeholders (pg 12) • Outside facilitator? • Data collection specialists? • Task checklist • Other issues: • Honesty • Perspective • Time-intensive

  12. Evaluation Starts with Goals • Evaluation should be rooted in your goals for how you are going to use or integrate that technology • Your technology plan is more than an infrastructure plan • It focuses on technology’s impact on teachers and students • It has clear goals and objectives for what you want to see happen

  13. Evaluation Logic Map

  14. Project Sample

  15. Your Project? • Using the Evaluation Logic Map, map your: • Project purpose/vision • Goals • Objectives • Actions
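
The logic map itself is not reproduced in this transcript, but the hierarchy it asks you to fill in (purpose/vision, goals, objectives, actions) can be sketched as nested data. A minimal, assumed illustration follows; the sample project, field names, and entries are invented and only show the shape of the exercise.

```python
# Hypothetical sketch of an Evaluation Logic Map as nested data.
# Field names and the sample project are invented for illustration only.
logic_map = {
    "purpose": "Improve science learning through technology integration",
    "goals": [
        {
            "goal": "Improve student mastery of the Skills of Inquiry",
            "objectives": [
                {
                    "objective": "Develop technology-enhanced river-study units",
                    "actions": [
                        "Train grade 5-8 teachers on the new units",
                        "Pilot the units in three classrooms",
                    ],
                },
            ],
        },
    ],
}

# Walk the map to confirm every objective has at least one action.
for goal in logic_map["goals"]:
    for objective in goal["objectives"]:
        assert objective["actions"], f"No actions for: {objective['objective']}"
        print(goal["goal"], "->", objective["objective"])
```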

  16. Goals Lead to Questions • What do you want to see happen? • These are your goals • Rephrase goals into questions • Achieving these goals requires a process that can be measured through a formative evaluation

  17. We Start with Goals… • To improve student achievement through participation in authentic and meaningful science learning experiences. • To provide advanced science and technology learning opportunities to all students regardless of learning styles or abilities. • To produce high quality science and technology curriculum in which the integration of technology provides “added value” to teaching and learning activities. • To increase students’ knowledge of the Connecticut River’s history and geology, and to gain an understanding of its past, present, and possible future environmental issues.

  18. …and move to questions • Has the project developed technology-enhanced science learning experiences that have been instrumental in improving student mastery of the Skills of Inquiry, understanding of the history/geology/ecology of the Connecticut River, and of the 5-8 science curriculum in general? • Has the project offered teacher professional development that has resulted in improved teacher understanding of universal design principles and technology integration strategies?

  19. …And Then to Indicators • What is it that you want to measure? • Whether the projects have enhanced learning • The relationship between the units and the selected curriculum, and the process by which they were developed • Increases in teacher technology skills (in relation to particular standards) • Whether the professional development model met with its design expectations • Collaborative and sustainable • Involves multiple subjects and administrators

  20. Indicators should reflect your project’s unique goals and aspirations • Rooted in proposed work • Indicators must be indicative of your unique environment...what constitutes success for you might not constitute success for someone else • Indicators need to be highly descriptive and can include both qualitative and quantitative measures

  21. Try a Sample Indicator • Going back to the Logic Map, try to develop a few indicators for your sample project • Keep it simple • Qualitative and quantitative • Will you be able to see the indicator?
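
As a worked example of an indicator that mixes a quantitative target with a qualitative look-for, here is a small sketch. The indicator statement, the 75% threshold, and the observed rate are all assumed values for illustration, not figures from the workshop.

```python
# Hypothetical indicator combining a quantitative target with a qualitative look-for.
# The statement, threshold, and observed rate are invented sample values.
indicator = {
    "statement": "Teachers integrate the river-study units into regular instruction",
    "quantitative_target": 0.75,  # e.g., at least 75% of grade 5-8 teachers
    "qualitative_look_for": "Units are adapted to the local curriculum, not used verbatim",
}

observed_rate = 0.80  # fraction of teachers observed/surveyed using the units (sample value)
met_quantitative = observed_rate >= indicator["quantitative_target"]
print("Quantitative target met:", met_quantitative)
print("Also check qualitatively:", indicator["qualitative_look_for"])
```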

  22. To Summarize... • Start with your proposal or technology plan • From your goals, develop indicators and a performance rubric

  23. Coming in Part II • Data Collection • Reporting

  24. How Do We Know It’s Working? Creating Evaluations for Technology Projects and Evaluations (part II)

  25. A Basic Process • Evaluation Questions • Must be tied to original planning goals • Performance Rubrics • Allow for authentic, qualitative, and holistic evaluation • Data Collection • Tied to indicators in the rubrics • Scoring and Reporting

  26. Measures? • Classroom observation, interviews, and work-product review • What are teachers doing on a day-to-day basis to address student needs? • Focus groups and surveys • Measuring teacher satisfaction • Triangulation with data from administrators and staff • Do other groups confirm that teachers are being served?

  27. Data Collection • Review Existing Data • Current technology plan • Curriculum • District/school improvement plans • www.sun-associates.com/eval/sample • Create a checklist for data collection

  28. Surveys • Creating good surveys • length • differentiation (teachers, staff, parents, community, etc.) • quantitative data • attitudinal data • timing/response rates (getting returns!) • www.sun-associates.com/eval/samples/samplesurv.html
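
Since timing and response rates are flagged as a survey concern, one simple aid is to track returns by respondent group. A minimal sketch follows, with made-up distribution and return counts; the 50% follow-up threshold is likewise an assumption.

```python
# Hypothetical survey return tracking by respondent group (all counts are invented).
distributed = {"teachers": 120, "staff": 40, "parents": 300, "community": 80}
returned = {"teachers": 95, "staff": 22, "parents": 110, "community": 15}

for group, sent in distributed.items():
    rate = returned.get(group, 0) / sent
    flag = "" if rate >= 0.5 else "  <- consider a follow-up reminder"
    print(f"{group:10s} {returned.get(group, 0):4d}/{sent:<4d} = {rate:5.1%}{flag}")
```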

  29. Surveys • Online • Profiler • LoTi • Zoomerang

  30. Survey Issues • Online surveys produce high response rates • Easy to report and analyze data • Potential for abuse • Depends on access to connectivity

  31. Focus Groups/Interviews • Teachers • Parents • Students • Administrators • Other stakeholders

  32. Classroom Observations • Using an observation template • Using outside observers

  33. Other Data Elements? • Artifact analysis • A rubric for analyzing teacher and student work? • Solicitation of teacher/parent/student stories • This is a way to gather truly qualitative data • What does the community say about the use and impact of technology?
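
The rubric for analyzing teacher and student work could be supported by a small scoring aid. The sketch below assumes invented rubric dimensions and a four-level scale; it simply averages per-dimension scores into a holistic level and is not a rubric from the workshop itself.

```python
# Hypothetical artifact-analysis rubric: dimensions, levels, and scores are invented.
RUBRIC_LEVELS = {1: "Beginning", 2: "Developing", 3: "Proficient", 4: "Exemplary"}

def holistic_level(scores):
    """Average per-dimension scores (1-4) and round to the nearest rubric level."""
    avg = sum(scores.values()) / len(scores)
    return round(avg), RUBRIC_LEVELS[round(avg)]

sample_artifact_scores = {
    "alignment_to_curriculum": 3,
    "added_value_of_technology": 2,
    "evidence_of_student_inquiry": 3,
}
level, label = holistic_level(sample_artifact_scores)
print(f"Holistic level: {level} ({label})")
```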

  34. Dissemination • Compile the report • Determine how to share the report • School committee presentation • Press releases • Community meetings

  35. Conclusion • Build evaluation into your technology planning effort • Remember, not all evaluation is quantitative • You cannot evaluate what you are not looking for, so it’s important to — • Develop expectations of what constitutes good technology integration

  36. More Information • jsun@sun-associates.com • 978-251-1600 ext. 204 • www.sun-associates.com/evaluation • www.edtechevaluation.com • This presentation is linked to that page
