
Part II: Online Learning: Opportunities for Assessment and Evaluation


Presentation Transcript


  1. Part II: Online Learning: Opportunities for Assessment and Evaluation • Dr. Curtis J. Bonk • Indiana University and CourseShare.com • http://php.indiana.edu/~cjbonk • cjbonk@indiana.edu

  2. Online Student Assessment

  3. Assessment Takes Center Stage in Online Learning (Dan Carnevale, April 13, 2001, Chronicle of Higher Education) “One difference between assessment in classrooms and in distance education is that distance-education programs are largely geared toward students who are already in the workforce, which often involves learning by doing.”

  4. Focus of Assessment? • Basic Knowledge, Concepts, Ideas • Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork • Both of Above!!! • Other…

  5. Assessments Possible • Online Portfolios of Work • Discussion/Forum Participation • Online Mentoring • Weekly Reflections • Tasks Attempted or Completed, Usage, etc.

  6. More Possible Assessments • Quizzes and Tests • Peer Feedback and Responsiveness • Cases and Problems • Group Work • Web Resource Explorations & Evaluations

  7. Sample Portfolio Scoring Dimensions (10 pts each) (see: http://php.indiana.edu/~cjbonk/p250syla.htm) • Richness • Coherence • Elaboration • Relevancy • Timeliness • Completeness • Persuasiveness • Originality • Insightful • Clear/Logical • Original • Learning • Feedback/Responsive • Format • Thorough • Reflective • Overall Holistic

  8. E-Peer Evaluation Form Peer Evaluation. Name: ____________________ Rate on Scale of 1 (low) to 5 (high): ___ 1. Insight: creative, offers analogies/examples, relationships drawn, useful ideas and connections, fosters growth. ___ 2. Helpful/Positive: prompt feedback, encouraging, informative, offers suggestions & advice, finds & shares info. ___ 3. Valuable Team Member: dependable, links group members, there for group, leader, participator, pushes group. ___ Total Recommended Contribution Pts (out of 15)
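The form above reduces to simple arithmetic: three criteria rated 1 to 5, summed into a recommended contribution score out of 15. A minimal sketch; the dictionary layout and criterion keys are assumptions, not part of the original form:

```python
# Hypothetical tally for the e-peer evaluation form: three 1-5 ratings
# summed into a recommended contribution score out of 15.

CRITERIA = ("insight", "helpful_positive", "valuable_team_member")

def contribution_points(ratings):
    """Sum the three peer ratings, validating the 1-5 scale."""
    for name in CRITERIA:
        if not 1 <= ratings[name] <= 5:
            raise ValueError(f"{name} must be rated 1 (low) to 5 (high)")
    return sum(ratings[name] for name in CRITERIA)

peer = {"insight": 4, "helpful_positive": 5, "valuable_team_member": 3}
print(contribution_points(peer))  # 12
```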

  9. E-Case Analysis Evaluation Peer Feedback Criteria (1 pt per item; 5 pts/peer feedback) (a) Provides additional points that may have been missed. (b) Corrects a concept, asks for clarification where needed, debates issues, disagrees & explains why. (c) Ties concepts to another situation or refers to the text or coursepack. (d) Offers valuable insight based on personal experience. (e) Overall constructive feedback.

  10. Issues to Consider… • Bonus pts for participation? • Peer evaluation of work? • Assess improvement? • Is it timed? Give unlimited time to complete? • Allow retakes if lose connection? How many retakes?

  11. Issues to Consider… • Cheating? Is it really that student? • Authenticity? • Negotiating tasks and criteria? • How measure competency? • How do you demonstrate learning online?

  12. Increasing Cheating Online ($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?) • http://www.academictermpapers.com/ • http://www.termpapers-on-file.com/ • http://www.nocheaters.com/ • http://www.cheathouse.com/uk/index.html • http://www.realpapers.com/ • http://www.pinkmonkey.com/ (“you’ll never buy Cliffnotes again”)

  13. Reducing Cheating Online • Ask yourself, why are they cheating? • Do they value the assignment? • Are tasks relevant and challenging? • What happens to the task after it is submitted—reused, woven in, posted? • Due at end of term? Real audience? • Look at pedagogy before calling the plagiarism police!

  14. Reducing Cheating Online • Proctored exams • Vary items in exam • Make course too hard to cheat • Try Plagiarism.com ($300) • Use mastery learning for some tasks • Random selection of items from an item pool • Use test passwords, rely on IP# screening • Assign collaborative tasks
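One of the bullets above, random selection of items from an item pool, can be sketched in a few lines. Seeding the generator per student (an assumption, not from the slides) keeps each student's form stable across reloads while differing between students:

```python
import random

# Sketch of "random selection of items from an item pool": each student
# draws a different subset, which makes answer-sharing less useful.
# The 50-item pool and per-student seeding are illustrative assumptions.

ITEM_POOL = [f"Q{n}" for n in range(1, 51)]

def build_exam(student_id, n_items=10):
    rng = random.Random(student_id)        # reproducible draw per student
    return rng.sample(ITEM_POOL, n_items)  # sample without replacement

print(build_exam("student-42"))
```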

  15. Reducing Cheating Online (http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?) • http://www.plagiarism.org/ (resource) • http://www.turnitin.com/ (software, $100, free 30-day demo/trial) • http://www.canexus.com/ (software; essay verification engine, $19.95) • http://www.plagiserve.com/ (free database of 70,000 student term papers & Cliff notes) • http://www.academicintegrity.org/ (assoc.) • http://sja.ucdavis.edu/avoid.htm (guide)

  16. Turnitin Testimonials “Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type in a paragraph or two of their work myself if I suspect plagiarism. Every time, there was a ‘hit.’ Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!”

  17. New Zealand Universities Consider Lawsuit Against Sites Selling Diplomas in Their Names. • The Web sites, which already offer fake diplomas in the names of hundreds of colleges in the United States and abroad, recently added New Zealand’s Universities of Auckland, Canterbury, and Otago to their lineup. The degrees sell for up to $250 each. • Feb 11, 2002, David Cohen, Chronicle of Higher Education

  18. Online Testing Tools

  19. Choice: Select companies that specialize in online assessment.

  20. Or: Use what the courseware package gives ya…

  21. Test Selection Criteria (Hezel, 1999) • Easy to Configure Items and Test • Handle Symbols • Scheduling of Feedback (immediate?) • Provides Clear Input of Exam Dates • Easy to Pick Items for Randomizing • Randomize Answers Within a Question • Weighting of Answer Options

  22. More Test Selection Criteria • Recording of Multiple Submissions • Timed Tests • Comprehensive Statistics • Summarize in Portfolio and/or Gradebook • Confirmation of Test Submission

  23. More Test Selection Criteria (Perry & Colon, 2001) • Supports multiple item types—multiple choice, true-false, essay, keyword • Can easily modify or delete items • Incorporate graphic or audio elements? • Control over number of times students can submit an activity or test • Provides feedback for each response

  24. More Test Selection Criteria(Perry & Colon, 2001) • Flexible scoring—score first, last, or average submission • Flexible reporting—by individual or by item and cross tabulations. • Outputs data for further analysis • Provides item analysis statistics (e.g., Test Item Frequency Distributions). Web Resource: http://www.indiana.edu/~best/
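The item-analysis statistics mentioned above (e.g., test item frequency distributions) are easy to compute from raw responses. A rough sketch; the sample data and answer key are invented for illustration:

```python
from collections import Counter

# Per-item response-frequency distribution and a difficulty index
# (proportion answering correctly). Data and answer key are made up.

responses = {
    "Q1": ["A", "A", "B", "A", "C"],
    "Q2": ["D", "B", "B", "B", "B"],
}
answer_key = {"Q1": "A", "Q2": "B"}

def item_stats(item):
    freq = Counter(responses[item])
    difficulty = freq[answer_key[item]] / len(responses[item])
    return dict(freq), difficulty

print(item_stats("Q1"))  # ({'A': 3, 'B': 1, 'C': 1}, 0.6)
```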

  25. Online Survey Tools for Assessment

  26. Sample Survey Tools • Zoomerang (http://www.zoomerang.com) • IOTA Solutions (http://www.iotasolutions.com) • QuestionMark (http://www.questionmark.com/home.html) • SurveyShare (http://SurveyShare.com; from Courseshare.com) • Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm) • Infopoll (http://www.infopoll.com)

  27. Web-Based Survey Advantages • Faster collection of data • Standardized collection format • Computer graphics may reduce fatigue • Computer controlled branching and skip sections • Easy to answer clicking • Wider distribution of respondents

  28. Web-Based Survey Problems: Why Lower Response Rates? • Low response rate • Lack of time • Unclear instructions • Too lengthy • Too many steps • Can’t find URL

  29. Survey Tool Features • Support different types of items (Likert, multiple choice, forced ranking, paired comparisons, etc.) • Maintain email lists and email invitations • Conduct polls • Adaptive branching and cross tabulations • Modifiable templates & library of past surveys • Publish reports • Different types of accounts—hosted, corporate, professional, etc.
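Adaptive branching, listed above, just means the answer to one question selects the next one. A toy sketch; the question texts and flow are invented:

```python
# Toy adaptive-branching table: each answer maps to the next question id;
# an empty branch table ends the survey. All content here is invented.

SURVEY = {
    "q1": {"text": "Have you taken an online course?",
           "branch": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "How satisfied were you?", "branch": {}},
    "q3": {"text": "What has kept you from trying one?", "branch": {}},
}

def next_question(current, answer):
    """Return the next question id, or None to end the survey."""
    return SURVEY[current]["branch"].get(answer)

print(next_question("q1", "yes"))  # q2
```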

  30. Web-Based Survey Solutions: Some Tips… • Send second request • Make URL link prominent • Offer incentives near top of request • Shorten survey, make attractive, easy to read • Credible sponsorship—e.g., university • Disclose purpose, use, and privacy • E-mail cover letters • Prenotify of intent to survey

  31. Tips on Authentication • Check e-mail access against list • Use password access • Provide keycode, PIN, or ID # • (Futuristic options: palm print, fingerprint, voice recognition, iris scanning, facial scanning, handwriting recognition, picture ID)
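The first three tips combine into a simple gate: confirm the e-mail appears on the invitation list, then verify its keycode. A hedged sketch; the roster and PIN scheme are assumptions:

```python
import hmac

# Check a respondent's e-mail against the invitation list, then verify
# the keycode/PIN sent with the invitation. Roster values are invented.

INVITED = {"pat@example.edu": "4821", "lee@example.edu": "9307"}

def authenticate(email, pin):
    expected = INVITED.get(email)
    if expected is None:
        return False                           # not on the invitation list
    return hmac.compare_digest(expected, pin)  # constant-time comparison

print(authenticate("pat@example.edu", "4821"))  # True
```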

  32. Evaluation…

  33. Champagne & Wisher (in press) “Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders.” (e.g., Does it work as effectively as the standard instructional approach?)

  34. Evaluation Purposes • Cost Savings • Improved Efficiency/Effectiveness • Learner Performance/Competency Improvement/Progress • What did they learn? • Assessing learning impact • How well do learners use what they learned? • How much do learners use what they learn?

  35. Kirkpatrick’s 4 Levels • Reaction • Learning • Behavior • Results

  36. My Evaluation Plan…

  37. What to Evaluate? • Student—attitudes, learning, jobs. • Instructor—popularity, course enrollments. • Training—internal and external. • Task—relevance, interactivity, collaboration. • Tool—usability, learner-centeredness, friendliness, support. • Course—interactivity, completion rates. • Program—growth, long-range plans. • University—cost-benefit, policies, vision.

  38. Measures of Student Success(Focus groups, interviews, observations, surveys, exams, records) • Positive Feedback, Recommendations • Increased Comprehension, Achievement • High Retention in Program • Completion Rates or Course Attrition • Jobs Obtained, Internships • Enrollment Trends for Next Semester

  39. 1. Student Basic Quantitative • Grades, Achievement • Number of Posts • Participation • Computer Log Activity—peak usage, messages/day, time on task or in system • Attitude Surveys

  40. 1. Student High-End Success • Message complexity, depth, interactivity, questioning • Collaboration skills • Problem finding/solving and critical thinking • Challenging and debating others • Case-based reasoning, critical thinking measures • Portfolios, performances, PBL activities

  41. 2. Instructor Success • High student evals; more signing up • High student completion rates • Utilizes the Web to share teaching • Course recognized in tenure decisions • Varies online feedback and assistance techniques

  42. 3. Training: Outside Support • Training (FacultyTraining.net) • Courses & Certificates (JIU, e-education) • Reports, Newsletters, & Pubs • Aggregators of Info (CourseShare, Merlot) • Global Forums (FacultyOnline.com; GEN) • Resources, Guides/Tips, Link Collections, Online Journals, Library Resources

  43. Certified Online Instructor Program • Walden Institute—12-week online certification (cost = $995) • 2 tracks: one for higher ed and one for online corporate trainers • Online tools and purpose • Instructional design theory & techniques • Distance ed evaluation • Quality assurance • Collaborative learning communities

  44. http://www.utexas.edu/world/lecture/
