This guide explores effective methods for evaluating technology programs in educational contexts. It emphasizes the importance of understanding key elements in evaluation, the challenges faced, and the necessary conditions for successful outcomes. By identifying evaluation questions, indicators, and data collection methods, educators can design effective evaluation plans that provide insights into program quality, impact, and sustainability. The aim is to enable practitioners to make informed decisions that enhance teaching and learning with technology.
The Ultimate Question: Does Your Technology Program Work? • Elizabeth Byrom, Principal Investigator • Anna Li, Evaluator
Objectives • Think about the context for evaluating technology programs • Identify the key elements of an evaluation model • Walk through steps for developing an evaluation plan • Identify evaluation resources
Why is evaluating technology programs a challenge? • Differences among adopters • Scale effects • Geography • Media as systems • Rapid change • Trail of use (B. Bruce, 2001)
Why is evaluation a challenge? • Re-creation of technology • New roles for teachers and students • Technical characteristics • Access (B. Bruce, 2001)
Observations from SEIR*TEC • Evaluation is often the weakest part of a technology program, owing to competing priorities and limited evaluation expertise. • Policymakers often have unrealistic expectations. • Traditional measures do not always apply.
Some Things to Consider • It takes four or five years for most teachers to become highly proficient in teaching with technology. • Effective use of technology usually requires changes in teaching strategies.
Some Things to Consider • It's the combined effect of good teaching, appropriate technologies, and a conducive environment that makes a difference in student achievement. • Good technology does not make up for poor teaching.
Professional Development Map • Inputs: plans, needs assessments, mandates & policies, research & best practices
Evaluation Questions • At least one question per objective • Questions about: • Accountability • Quality • Impact • Sustainability • Lessons learned
Key Elements of an Evaluation Model • Questions • Indicators • Methods • Criteria • Outcomes
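One way to see how these elements fit together is to lay an evaluation plan out as structured data, one entry per objective. The sketch below (in Python, purely illustrative) uses invented values for every field; it is not the SEIR*TEC plan.

```python
# A minimal, hypothetical evaluation-plan entry linking the five elements.
# Every value below is an invented example, not from the SEIR*TEC program.
evaluation_plan = [
    {
        "objective": "Teachers integrate technology into daily instruction",
        "question": "How effective has professional development been?",
        "indicator": "Teachers report redesigning lessons around technology",
        "methods": ["questionnaire", "classroom observation"],
        "criterion": "75% of teachers rate the PD as effective",
        "outcome": "Decide whether to expand, revise, or end the PD model",
    },
]

for entry in evaluation_plan:
    print(f"Q: {entry['question']}")
    print(f"  Indicator: {entry['indicator']}")
    print(f"  Methods: {', '.join(entry['methods'])}")
    print(f"  Criterion: {entry['criterion']}")
```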
Kinds of Questions • Accountability: Is the program doing what it is supposed to do? • Quality: How well are we implementing program activities and strategies? How good (useful, effective, well received) are products and services?
Kinds of Questions • Impact: Is the program making a difference? What effects are services and products having on target populations? • Proximal effects • Distal effects
Kinds of Questions • Sustainability: What elements are, or need to be, in place for a sustained level of improvement in teaching and learning with technology to occur? • Lessons learned: What lessons are we learning about the processes and factors that support or inhibit the accomplishment of objectives?
Sample Questions • To what extent are teachers using technology to increase the depth of student understanding and engagement? • How have students been impacted by technology integration? • How effective has our professional development been in helping teachers attain basic technology proficiency? In helping them learn effective teaching practices?
Indicators • Definition: a statement of what you would expect to find out or see that demonstrates a particular attribute. For example, an indicator of technology integration might be that teachers' lesson plans include student use of technology for analysis and presentation. • Focus on: quality, effectiveness, efficacy, usefulness, client satisfaction, impact
Information Sources • Self-reports • Questionnaires • Interviews • Journals and anecdotal accounts • Products from participants • Samples of work, tests, and portfolios • Observations • Media: videotape, audiotape, photographs • Archives
Sources of Information – Tracking Tools • Milken Exchange Framework • CEO Forum STaR Chart • Learning with Technology Profile Tool (NCRTEC) • SEIR*TEC – Technology Integration Gauge for Success • Profiler (HPRTEC)
Methods/Strategies for Collecting Data • Questionnaire • Survey • Interview • Focus group • Observation • Archival records
Criteria and Benchmarks • Stick a stake in the ground and say "we are here today" • Likert-type scales or rubrics • STaR Chart • SEIR*TEC Progress Gauge • Percentages, e.g., a 75% passing rate
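As a concrete illustration of benchmarking, the sketch below compares a current measure against a baseline stake in the ground and a target criterion. The baseline, current value, and 75% target are all hypothetical, with the 75% echoing the passing-rate example above.

```python
# Hypothetical benchmark check against a baseline ("we are here today")
# and a target criterion such as a 75% passing rate. Values are invented.
BASELINE = 0.58  # the stake in the ground at the start of the program
TARGET = 0.75    # the criterion set in the evaluation plan

def benchmark_report(current: float) -> str:
    change = current - BASELINE
    status = "met" if current >= TARGET else "not yet met"
    return (f"Current: {current:.0%} ({change:+.0%} vs. baseline); "
            f"target of {TARGET:.0%} {status}.")

print(benchmark_report(0.68))
# -> Current: 68% (+10% vs. baseline); target of 75% not yet met.
```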
Outcomes • Decisions are made about maintaining, changing, or eliminating aspects of the program • Convincing evidence is gathered for proposals and plans • Products developed and distributed • Reports • Plans
Data Analysis • Quantitative data • Use an Excel spreadsheet, or SPSS for Windows or Mac • Qualitative data • Content analysis: look for emerging themes and summarize
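For small data sets, Python's standard library can stand in for Excel or SPSS. A minimal sketch, assuming hypothetical 5-point Likert ratings and theme codes already assigned to open-ended comments:

```python
from collections import Counter
from statistics import mean, stdev

# Quantitative: summarize hypothetical 5-point Likert ratings for one item.
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 4, 3]
print(f"n={len(ratings)}, mean={mean(ratings):.1f}, sd={stdev(ratings):.2f}")

# Qualitative: a crude content analysis. Tally the theme codes assigned
# to open-ended comments, then read off the most frequent (emerging) themes.
coded_comments = ["time", "training", "access", "training", "time", "training"]
for theme, count in Counter(coded_comments).most_common():
    print(f"Theme '{theme}': {count} mention(s)")
```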
Using Evaluation Results • Make data-informed decisions • Make a case for continued funding • Inform research and the public
Evaluation Data on Impact • SEIR*TEC professional development models, listed by greatest impact:

Model           High Quality   Met Needs   Timely   Important Resource
Institutes          100%          100%      100%          100%
Academies           100%          97.4%     97.4%         96.5%
Core Groups         94.7%         86.2%     92.9%         96.3%
Workshops           89%           84%       87.5%         85.9%
Presentations       89.3%         78.8%     83.7%         79%
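Percentages like those above are often computed as the share of participants giving a favorable rating on each dimension. A minimal sketch under that assumption (the actual SEIR*TEC rating scale and raw data are not shown here, so the scale, cutoff, and responses below are invented):

```python
# Hypothetical: percent of participants rating one PD model favorably
# (4 or 5 on an assumed 5-point scale) on each of the four dimensions.
responses = {
    "High Quality":       [5, 4, 4, 5, 3, 4, 5, 2],
    "Met Needs":          [4, 4, 3, 5, 4, 2, 5, 4],
    "Timely":             [5, 5, 4, 4, 3, 4, 4, 5],
    "Important Resource": [4, 3, 5, 4, 4, 5, 2, 4],
}

for dimension, ratings in responses.items():
    favorable = sum(r >= 4 for r in ratings) / len(ratings)
    print(f"{dimension}: {favorable:.0%}")
```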
Hints for Successful Evaluation • Think positively: evaluation is an opportunity to learn. • Try to be objective. • Make evaluation an integral part of the program. • Involve stakeholders in the evaluation process. • Brag about your successes, however small. • Ask for help when you need it.
Recommended Books • King, Morris, & Fitz-Gibbon, How to Assess Program Implementation. Sage, 1987. • Patton, Utilization-Focused Evaluation, 3rd edition. Sage, 1996. • Patton, How to Use Qualitative Methods in Evaluation. Sage, 1987. • Joint Committee on Standards for Educational Evaluation, The Program Evaluation Standards, 2nd edition. Sage, 1994. • Campbell & Stanley, Experimental and Quasi-Experimental Designs for Research. Houghton Mifflin, 1963.
Evaluation Resources • SEIR*TEC Web Site: http://www.seirtec.org • US Department of Education: • http://www.ed.gov/pubs/EdTechGuide/ • http://www.ed.gov/Technology/TechConf/1999/whitepapers/paper8.html • Education Evaluation Primer: http://www.ed.gov/offices/OUS/eval/primer1.html
Evaluation Resources • Muraskin, Understanding Evaluation: The Way to Better Prevention Programs. http://www.ed.gov/PDFDocs/handbook.pdf • National Science Foundation, User-Friendly Handbook for Program Evaluation. www.ehr.nsf.gov/EHR/RED/EVAL/handbook/handbook.htm • W.K. Kellogg Foundation Evaluation Handbook. http://www.wkkf.org/Publications/evalhdbk/default.htm
For Further Information, Contact • Elizabeth Byrom, Ed.D., Ebyrom@serve.org • Anna Li, Ali@serve.org • SouthEast Initiatives Regional Technology in Education Consortium • 1-800-755-3277 • www.seirtec.org