The Ultimate Question: Does Your Technology Program Work?

Presentation Transcript


  1. The Ultimate Question: Does Your Technology Program Work? Elizabeth Byrom, Principal Investigator Anna Li, Evaluator

  2. Objectives • Think about the context for evaluating technology programs • Identify the key elements of an evaluation model • Walk through steps for developing an evaluation plan • Identify evaluation resources

  3. Why is evaluating technology programs a challenge? • Differences among adopters • Scale effects • Geography • Media as systems • Rapid change • Trail of use (B. Bruce, 2001)

  4. Why is evaluation a challenge? • Re-creation of technology • New roles for teachers and students • Technical characteristics • Access (B. Bruce, 2001)

  5. Observations from SEIR*TEC • Evaluation is often the weakest part of a technology program. • Competing priorities • Expertise • Policymakers often have unrealistic expectations. • Traditional measures do not always apply.

  6. Some Things to Consider • It takes four or five years for most teachers to become highly proficient in teaching with technology. • Effective use of technology usually requires changes in teaching strategies.

  7. Some Things to Consider • It’s the combined effect of good teaching, appropriate technologies, and a conducive environment that makes a difference in student achievement. • Good technology does not make up for poor teaching.

  8. Professional Development Map (diagram) • Inputs → Plans • Inputs include Needs Assessments, Mandates & Policies, and Research & Best Practices

  9. Evaluation Questions • At least one question per objective • Questions on • Accountability • Quality • Impact • Sustainability • Lessons learned

  10. Key Elements of the Evaluation Model (diagram) • Questions • Indicators • Methods • Criteria • Outcomes
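The slides that follow take up each of these elements in turn. As a rough illustration (not part of the original presentation), the elements for a single evaluation question might be captured in a simple data structure; the field names and the sample entry below are hypothetical.

```python
# Hypothetical sketch: one way to record the evaluation-model elements for a
# single question. Nothing here is prescribed by the presentation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationQuestion:
    question: str           # accountability, quality, impact, sustainability, or lessons learned
    indicators: List[str]   # what you would expect to see if the objective is being met
    methods: List[str]      # how data will be collected (survey, interview, observation, ...)
    criteria: str           # the benchmark that counts as success
    outcomes: List[str] = field(default_factory=list)  # resulting decisions, reports, plans

plan = [
    EvaluationQuestion(
        question="To what extent are teachers using technology to increase the depth of student understanding?",
        indicators=["Lesson plans show technology used for analysis rather than drill"],
        methods=["Classroom observation", "Teacher questionnaire"],
        criteria="75% of observed lessons reach the 'proficient' level on the rubric",
    )
]
print(plan[0].question)
```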

  11. Kinds of Questions • Accountability: Is the program doing what it is supposed to do? • Quality: How well are we implementing program activities and strategies? How good (useful, effective, well received) are products and services?

  12. Kinds of Questions • Impact: Is the program making a difference? What effects are services and products having on target populations? • Proximal effects • Distal effects

  13. Kinds of Questions • Sustainability – What elements are, or need to be, in place for a sustained level of improvement in teaching and learning with technology to occur? • Lessons learned – What lessons are we learning about the processes and factors that support or inhibit the accomplishment of objectives?

  14. Sample Questions • To what extent are teachers using technology to increase the depth of student understanding and engagement? • How have students been impacted by technology integration? • How effective has our professional development been in helping teachers attain basic technology proficiency? In helping them learn effective teaching practices?

  15. Indicators • Definition: a statement of what you would expect to find or observe that demonstrates a particular attribute. • Focus on: • Quality, effectiveness, efficacy, usefulness, client satisfaction, impact

  16. Information Sources • Self-reports • Questionnaires • Interviews • Journals and anecdotal accounts • Products from participants • Samples of work, tests, and portfolios • Observations • Media: videotape, audiotape, photographs • Archives

  17. Sources of Information – Tracking Tools • Milken Exchange Framework • CEO Forum STaR Chart • Learning with Technology Profile Tool (NCRTEC) • SEIR*TEC – Technology Integration Gauge for Success • Profiler (HPRTEC)

  18. profiler.hprtec.org

  19. Methods/strategies for collecting data • Questionnaire • Survey • Interview • Focus group • Observation • Archival records

  20. Criteria and Benchmarks • Stick a stake in the ground and say “we are here today” • Likert-type scales or rubrics • STaR Chart • SEIR*TEC Progress Gauge • Percentages, e.g., a 75% passing rate
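As a minimal illustration of a percentage criterion (the 75% passing rate is from the slide above, but the data and function below are invented), a benchmark check might look like this:

```python
# Minimal sketch, not from the presentation: check a made-up set of pass/fail
# results against a 75% passing-rate benchmark.
def passing_rate(results: list) -> float:
    """Return the share of participants who passed, as a percentage."""
    return 100 * sum(1 for passed in results if passed) / len(results)

results = [True, True, False, True, True, True, False, True]  # invented data
rate = passing_rate(results)
print(f"Passing rate: {rate:.1f}% ({'meets' if rate >= 75 else 'below'} the 75% benchmark)")
```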

  21. Outcomes • Decisions are made about maintaining, changing, or eliminating aspects of the program • Convincing evidence is gathered for proposals and plans • Products are developed and distributed • Reports • Plans

  22. Data Analysis • Quantitative data • Use an Excel spreadsheet or SPSS (Windows or Mac) • Qualitative data • Content analysis • Look for emerging themes and summarize
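A spreadsheet or SPSS would handle this in practice; purely as an illustrative sketch (the survey items, scores, and coded excerpts below are invented), the two kinds of analysis might look like this:

```python
# Illustrative only: summary statistics for Likert-type survey items and a
# crude theme count for coded qualitative excerpts. All data is made up.
from statistics import mean, stdev
from collections import Counter

# Quantitative: 1 = strongly disagree ... 5 = strongly agree
likert = {
    "The PD met my needs": [5, 4, 4, 5, 3, 4],
    "The PD was timely":   [4, 4, 3, 5, 4, 4],
}
for item, scores in likert.items():
    print(f"{item}: mean={mean(scores):.2f}, sd={stdev(scores):.2f}, n={len(scores)}")

# Qualitative: count how often each coded theme appears across interview excerpts
coded_themes = ["access", "time", "access", "support", "time", "access"]
for theme, count in Counter(coded_themes).most_common():
    print(f"Theme '{theme}': {count} mentions")
```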

  23. Using Evaluation Results • Make data-informed decisions • Make a case for continued funding • Inform research and the public

  24. Evaluation Data on Impact • SEIR*TEC Professional Development Models, listed by greatest impact:

                    High Quality   Met Needs   Timely   Important Resource
     Institutes         100%         100%      100%          100%
     Academies          100%        97.4%     97.4%         96.5%
     Core Groups       94.7%        86.2%     92.9%         96.3%
     Workshops            89%          84%    87.5%         85.9%
     Presentations     89.3%        78.8%     83.7%           79%

  25. Hints for Successful Evaluation • Think positively; evaluation is an opportunity to learn. • Try to be objective. • Make evaluation an integral part of the program. • Involve stakeholders in the evaluation process. • Brag about your successes, however small. • Ask for help when you need it.

  26. Recommended Books • King, Morris, & Fitz-Gibbon, How to Assess Program Implementation, Sage, 1987 • Patton, Utilization-Focused Evaluation, 3rd edition, Sage, 1996 • Patton, How to Use Qualitative Methods in Evaluation, Sage, 1987 • Joint Committee on Standards for Educational Evaluation, The Program Evaluation Standards, 2nd edition, Sage, 1994 • Campbell & Stanley, Experimental and Quasi-Experimental Designs for Research, Houghton Mifflin, 1963

  27. Evaluation Resources • SEIR*TEC Web Site: http://www.seirtec.org • US Department of Education: • http://www.ed.gov/pubs/EdTechGuide/ • http://www.ed.gov/Technology/TechConf/1999/whitepapers/paper8.html • Education Evaluation Primer: http://www.ed.gov/offices/OUS/eval/primer1.html

  28. Evaluation Resources • Muraskin, Understanding Evaluation: The Way to Better Prevention Programs, http://www.ed.gov/PDFDocs/handbook.pdf • National Science Foundation, User-Friendly Handbook for Program Evaluation, www.ehr.nsf.gov/EHR/RED/EVAL/handbook/handbook.htm • W.K. Kellogg Foundation Evaluation Handbook, http://www.wkkf.org/Publications/evalhdbk/default.htm

  29. For Further Information, Contact • Elizabeth Byrom, Ed.D. Ebyrom@serve.org • Anna Li Ali@serve.org SouthEast Initiatives Regional Technology in Education Consortium 1-800-755-3277 www.seirtec.org
