
Evaluation for Instructional Technology SAP 131



Presentation Transcript


  1. Evaluation for Instructional Technology SAP 131 Jeff Sun – Sun Associates

  2. jsun@sun-associates.com • www.sun-associates.com • Program evaluation is what we do • The process we’re showing today is applied to projects great and small • Statewide initiatives to individual school/district projects • Technology Planning • Technology Review and Assessment • Technology Projects • TAH • STEM

  3. Today’s Organization • Overview of a process for effective program evaluation: • What are we hoping to accomplish? (clarifying goals) • What will success look like? (creating indicators) • What does our data say? (looking at progress) • What can we do to improve? (making recommendations and reporting) • Tools and Techniques for Data Collection • Q & A and Discussion

  4. To Recap the Key Points • Evaluation is based in project/initiative goals • What are we hoping to accomplish? • Each goal connects to an indicator that describes successful accomplishment of the goal • Accomplishment comes from performing activities effectively • What will success look like? • Data collection is the process of gathering information from project participants about the impact of activities • What does our data say? • Analysis of the data findings – against the indicators – allows us to see accomplishments and adjust our work to increase/improve accomplishments • What can we do to improve?/ What did we accomplish? • Reporting happens continually • It’s formative if the project continues once recommendations are made • It’s summative if the project is over once reporting occurs
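
A minimal sketch of that goal-to-indicator-to-data chain, expressed as one record per goal. This is only an illustration of the recap above; the field names and the sample project values are hypothetical, not workshop materials.

```python
# Illustrative only: one record that keeps a goal, its indicator, and the
# data/findings/recommendations that flow from it in a single place.
from dataclasses import dataclass, field

@dataclass
class GoalEvaluation:
    goal: str                                  # What are we hoping to accomplish?
    indicator: str                             # What will success look like?
    data_sources: list = field(default_factory=list)     # surveys, observations, artifacts...
    findings: list = field(default_factory=list)         # data compared against the indicator
    recommendations: list = field(default_factory=list)  # What can we do to improve?

# Hypothetical sample entry
example = GoalEvaluation(
    goal="Students use the project's tools to explore what-if science scenarios",
    indicator="Tools are available, fit the curriculum, and accommodate learning styles",
    data_sources=["teacher survey", "classroom observation", "student artifacts"],
)
print(example.goal)
```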

  5. A Sample Project • To see how this works practically, let’s examine a sample project…

  6. What are we hoping to accomplish? • Project Goals • Supported by Activities

  7. What will success look like? • Indicators – one for each goal • What will it look like when the project has done the activities that relate to fulfilling the goals?

  8. What does the data say? • Data is collected in response to what we’re looking for … as described in our indicators

  9. Data Collection The project has created a wide range of tools that are available to project participants. These tools fit well within the PCB curriculum and accommodate a variety of student learning styles. The tools allow students to explore a variety of what-if science scenarios and are effective at conveying concepts in science. The tools are supported with print material and other resources. • Broadly, we are asking…What can we discover about the tools that the project has created? • Specifically, we want to know… • What can we learn about the availability of the tools (to participants)? • What can we learn about the function of the tools…how well they fit the curriculum and accommodate learning styles? • What can we discover about how the tools are used by students? • What can we learn about how the tools are supported?

  10. Developing questions

  11. Different tools/methodologies are suited for different types of questions and audiences • Example tools at www.sun-associates.com/necc2008 • Focus groups and interviews • Tests • Surveys • Observations • Artifact Analysis

  12. Tool recap • Data collection involves using a variety of tools to answer a variety of questions – all connected to indicators • Which tool you use depends on opportunity… • What data collection opportunities are presented? • …and appropriateness of format • When is one format more appropriate than another? • Different tools tell you different things • You probably want both self-reported data (e.g., a survey) as well as objective data (e.g., an evaluator’s observation) • Both add together to paint the picture of how you are doing in terms of your indicators
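
For example, a minimal sketch of pairing self-reported and objective data for a single indicator. All of the numbers and the 1-4 rubric scale here are hypothetical.

```python
# Illustrative sketch: self-reported data alongside objective data for one indicator.
survey_responses = [3, 4, 2, 4, 3]      # teachers' self-reported ratings (hypothetical, 1-4 scale)
observation_ratings = [2, 3, 3]         # evaluator's classroom observation ratings (hypothetical, 1-4 scale)

def mean(values):
    return sum(values) / len(values)

print(f"Self-reported (survey) mean:  {mean(survey_responses):.1f}")
print(f"Objective (observation) mean: {mean(observation_ratings):.1f}")
# A gap between the two means is itself a finding worth exploring in focus groups or interviews.
```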

  13. Findings and reporting • Findings are what you get when you compare your data to your indicators. • Findings lead to recommendations
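
A minimal sketch of turning that comparison into a finding and a recommendation. The indicator wording, target, and observed value below are hypothetical.

```python
# Illustrative sketch: compare a data summary to an indicator target to phrase a finding.
indicator = "Tools accommodate a variety of student learning styles"
target = 3.0       # hypothetical threshold on a 1-4 rubric
observed = 2.6     # hypothetical combined rating from surveys and observations

if observed >= target:
    finding = f"Indicator met ({observed:.1f} >= {target:.1f}): {indicator}"
    recommendation = "Sustain current activities and document effective practices."
else:
    finding = f"Indicator not yet met ({observed:.1f} < {target:.1f}): {indicator}"
    recommendation = "Adjust activities (e.g., targeted professional development) and re-collect data."

print(finding)
print(recommendation)
```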

  14. Additional tools and resources • www.sun-associates.com/necc2008 • All of the handouts for this workshop as well as links to other reading and materials • www.sun-associates.com/edtechevaluation • www.neirtec.org/evaluation

  15. Connections to your own work • In what ways can you imagine using this process in your own project work? • What would be some challenges you might encounter in attempting to do this? • jsun@sun-associates.com • www.sun-associates.com/necc2008
