
Assessing Learning Objects: Do They Make Learning and Teaching Better, Faster, Cheaper?




Presentation Transcript


  1. Assessing Learning Objects: Do They Make Learning and Teaching Better, Faster, Cheaper? Diane J. Goldsmith, Dean of Planning, Research, and Assessment, Connecticut Distance Learning Consortium

  2. Carol Twigg’s Critique “MERLOT claims to have 7,000 or so learning objects in a database. But if these learning objects haven’t been evaluated in terms of whether or not they increase student learning, you then just have 7,000 sort of mildly interesting things collected in a database.” – Carol Twigg, in an interview: http://www.educause.edu/pub/er/erm04/erm0443.asp?bhcp=1

  3. Assessment Is Value Driven; Assessment Exists in a Context. Assessment compares what exists with what ought to exist. Therefore you have to know what ought to exist: why are you creating learning objects?

  4. Principles of Assessment • AAHE’s 9 Principles of Good Practice for Assessing Student Learning: • Begins with educational values • Multi-dimensional, integrated, over time • The program to be improved has clear goals • Includes outcomes and experiences • Ongoing

  5. Principles of Assessment (9 Principles, cont’d): • Includes people from across the institution • Begins with issues of use and the questions people care about • Will lead to improvement if part of a set of conditions that promote change • Meets our responsibilities to students and the public

  6. Assessment/Evaluation Model • Utilization-Focused Evaluation (M. Q. Patton) • Looks at utilization (important for learning objects) • Based on six basic questions

  7. “Utilization-Focused Evaluation (U-FE) begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use.” – Utilization-Focused Evaluation Checklist, Michael Quinn Patton, January 2002

  8. “Use concerns how real people in the real world apply evaluation findings and experience the evaluation process. Therefore, the focus in utilization-focused evaluation is on intended use by intended users. Since no evaluation can be value-free, utilization-focused evaluation answers the question of whose values will frame the evaluation by working with clearly identified, primary intended users who have responsibility to apply evaluation findings and implement recommendations.” – Utilization-Focused Evaluation Checklist, Michael Quinn Patton, January 2002

  9. Basic Assessment Questions 1. Who wants learning objects? • IT folks • Instructional Designers • Faculty • Students • Administrators

  10. Basic Assessment Questions 2. What do they want them for? 3. By what criteria will they be judged?

  11. Basic Assessment Questions 4. How will you collect the necessary data as evidence of how well you are meeting the criteria you have established? • How much of this assessment can be embedded in the learning object? • Assessment requires staff time, money, and other resources.

  12. Basic Assessment Questions 5. When will you evaluate? • Formative (during the building and testing stages) • Summative (after the object is deployed) • Continuous

  13. Basic Assessment Questions 6. What will you do with the evidence? • Assessment shouldn’t be an end in itself • It should lead to improvements, changes, and further assessment

  14. Assessing Learning: The Issue of Causality • Experimental design: random assignment, control groups, blinding • Quasi-experimental designs (commonly used in education): • Non-equivalent groups – post-test only • Non-equivalent groups – pre/post-test • Non-equivalent groups – time-series designs (a minimal analysis sketch for the pre/post-test case follows)
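To make the non-equivalent-groups pre/post-test design concrete, here is a minimal analysis sketch. It is illustrative only: the file name, column names, and group labels are hypothetical, and it assumes pandas and SciPy are available.

    # Sketch: non-equivalent groups, pre/post-test design.
    # Compare score *gains* between a section that used the learning
    # object and one that did not. All names here are hypothetical.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("scores.csv")  # columns: group, pretest, posttest
    df["gain"] = df["posttest"] - df["pretest"]

    lo_gain = df.loc[df["group"] == "learning_object", "gain"]
    trad_gain = df.loc[df["group"] == "traditional", "gain"]

    # Welch's t-test: the groups are non-equivalent, so we do not
    # assume equal variances.
    t, p = stats.ttest_ind(lo_gain, trad_gain, equal_var=False)
    print(f"mean gain (LO) = {lo_gain.mean():.2f}, "
          f"mean gain (traditional) = {trad_gain.mean():.2f}, "
          f"t = {t:.2f}, p = {p:.3f}")

Even a significant difference in gains is only suggestive; the threats to causality on the next slide still apply.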

  15. Some Threats to Causality • Hawthorne effect • Newness • Small sample sizes • Reliability of measures • Standard usage • History • Mortality (attrition) • Selection

  16. Model – Peer Evaluation • MERLOT: http://www.merlot.org • Each catalog entry lists, for example: Peer Reviews (1) with average rating, Member Comments (20) with average rating, Assignments (1), Collections (7) • Example of a peer review

  17. Models – User Evaluation • University of Wisconsin Online Resource Center, a repository of learning objects: “The goals of the project are to accelerate the development of quality online courses while, at the same time, minimizing the cost of course development by identifying and sharing best practices.”

  18. Models – User Evaluation • Bloom’s Taxonomy for Cognitive Learning and Teaching • Author: Terri Langan, Fox Valley Technical College • Date: 12/10/2002 • Description: The users of this learning object read a brief introduction to the six levels of Bloom’s Cognitive Taxonomy and quiz themselves on a basic understanding of the levels. • View this object | Read reviews

  19. Model – Outcomes • The Connecticut Distance Learning Consortium created a learning-object repository for the Connecticut Vocational-Technical schools. The searchable database is at http://www.ctdlc.org/votech/

  20. Model – Outcomes • Subject: Math • Core Topic: G. Calculating heights with trigonometry concepts • Course: Geometry • Abstract: This unit requires students to use trigonometry concepts to calculate the heights of buildings and objects. Students are expected to construct a clinometer and use it to calculate the heights of objects. • Objective: The unit objectives are aligned with the CT Framework, Math, Grades 9–12, Standard 6, Spatial Relationships and Geometry.
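The mathematics behind such a unit is standard clinometer trigonometry (general background, not quoted from the unit itself): with horizontal distance d to the object, elevation angle θ read from the clinometer, and observer eye height h_e,

    \[ h = d \tan\theta + h_e \]

For example, d = 20 m, θ = 35°, and h_e = 1.6 m give h ≈ 20 × 0.70 + 1.6 ≈ 15.6 m.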

  21. Model – Cost Evaluation • Center for Academic Transformation, Program in Course Redesign (Carol Twigg): learning objects as a cost-reduction strategy • Includes a tool for calculating costs (an illustrative calculation follows)
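The program’s actual cost tool is not reproduced here, but a toy version of the arithmetic shows the idea: a one-time development cost is amortized over every student who reuses the object. The function and all numbers below are hypothetical, not drawn from the Program in Course Redesign.

    # Illustrative only: amortize a one-time development cost over reuse.
    def cost_per_student(development_cost, delivery_cost_per_student, students):
        """Average cost per student once development is spread over all users."""
        return development_cost / students + delivery_cost_per_student

    # Hypothetical: a $20,000 learning object lets a course cut per-student
    # delivery cost from $150 to $40, and is reused by 300 students.
    print(cost_per_student(0, 150.0, 300))      # traditional: 150.00
    print(cost_per_student(20_000, 40.0, 300))  # redesigned: ~106.67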

  22. Model – Design Evaluation • AliveTek, Inc. Learning Object Checklist • Questions are mostly about instructional design: • Navigation • Learner interaction • Learner control of the object • Contains assessment • Amount covered is appropriate

  23. Example • Wesleyan’s Learning Object Project Assessment • Multi-campus assessment • Based on a non-equivalent-groups, post-test-only design • Student background • Student usage logs • Student performance on exams • Student and faculty surveys • Targeted focus groups

  24. Conclusion • Who is advocating for learning objects? • What “opposition” is there? Why? • What features of learning objects are most important to them? • What evidence will they need to show that learning objects “work” (i.e., are worth staff time and/or other resources)? • How and when will you gather that evidence?

  25. Q&A Diane J. Goldsmith, Dean of Planning, Research, and Assessment, CT Distance Learning Consortium, Dgoldsmith@ctdlc.org, 860-832-3893
