What Have We Learned From the Research on Online Learning?

Presentation Transcript


  1. What Have We Learned From the Research on Online Learning? Dr. Curtis J. Bonk Professor, Indiana University President, CourseShare and SurveyShare http://php.indiana.edu/~cjbonk, cjbonk@indiana.edu

  2. Tons of Recent Research. Not much of it... is any good.

  3. Basic Distance Learning Finding? • Research since 1928 shows that DL students perform as well as their counterparts in a traditional classroom setting. Per: Russell, 1999, The No Significant Difference Phenomenon (5th Edition), NCSU, based on 355 research reports. http://cuda.teleeducation.nb.ca/nosignificantdifference/

  4. Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999). • Anecdotal evidence; minimal theory. • Questionable validity of tests. • Lack of control groups. • Hard to compare results given different assessment tools and domains. • Fails to explain why the drop-out rates of distance learners are higher. • Does not relate learning styles to different technologies or focus on the interaction of multiple technologies.

  5. Online Learning Research Problems(Bonk & Wisher, 2001) • For different purposes or domains: in our study, 13% concern training, 87% education • Flaws in research designs - Only 36% have objective learning measures - Only 45% have comparison groups • When effective, it is difficult to know why - Course design? - Instructional methods? - Technology?

  6. Evaluating Web-Based Instruction: Methods and Findings (41 studies) (Olson & Wisher, October 2002, International Review of Research in Open and Distance Learning) http://www.irrodl.org/content/v3.2/olsen.html

  7. Web-Based Instruction vs. Computer-Based Instruction (CBI)
                           Web-Based Instruction   CBI, Kulik [8]   CBI, Liao [18]
     Average Effect Size            .31                  .32              .41
     Number of Studies               11                   97               46
     Wisher's Wish List • Effect size of .5 or higher in comparison to traditional classroom instruction.
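The effect sizes on this slide are standardized mean differences. As a minimal sketch of how such a figure is computed, the function below implements Cohen's d with a pooled standard deviation; the exam scores in the example are hypothetical, not taken from any study cited here.

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between a treatment group (e.g., web-based)
    and a control group (e.g., classroom), using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical final-exam scores: online group vs. classroom group
d = cohens_d(mean_t=78, sd_t=10, n_t=30, mean_c=73, sd_c=10, n_c=30)
print(round(d, 2))  # 0.5 -- would just meet Wisher's .5 threshold
```

An effect size of .31 means the average web-based student scored about a third of a standard deviation above the average classroom student.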

  8. Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, 2002) "…there is little consensus as to what variables should be examined and what measures of learning are most appropriate, making comparisons between studies difficult and inconclusive." e.g., demographics (age, gender), previous experience, course design, instructor effectiveness, technical issues, levels of participation and collaboration, recommendation of course, desire to take additional online courses.

  9. Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, 2002) Variables Studied: • Type of Course: Graduate (18%) vs. undergraduate (81%) courses • Level of Web Use: All-online (64%) vs. blended/mixed (34%) courses • Content area (e.g., math/engineering (27%), science/medicine (24%), distance ed (15%), social science/educ (12%), business (10%), etc.) • Attrition data (34%) • Comparison group (59%)

  10. Some of the Research Gaps (Bonk & Wisher, 2000) 1) Variations in Instructor Moderation 2) Online Debating 3) Student Perceptions of e-Learning Environments 4) Development of Online Learning Communities 5) Time Allocation: Instructor and Student 6) Critical Thinking and Problem Solving Applications in Synchronous/Asynchronous Environments 7) Peer Tutoring and Online Mentoring 8) Student Retention: E-learning and Attrition 9) Graphical Representation of Ideas 10) Online Collaboration

  11. Compare Higher Ed and Corp

  12. 1. Research in Higher Ed

  13. My Evaluation Plan…

  14. Electronic Conferencing: Quantitative Analyses • Usage patterns, # of messages, cases, responses • Length of case, thread, response • Average number of responses • Timing of cases, commenting, responses, etc. • Types of interactions (1:1; 1: many) • Data mining (logins, peak usage, location, session length, paths taken, messages/day/week), Time-Series Analyses (trends)
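As an illustration of the "messages/day" style of data mining listed above, here is a minimal sketch in Python. The ISO-8601 timestamp format and the sample log are assumptions for illustration, not the output of any particular conferencing system.

```python
from collections import Counter
from datetime import datetime

def messages_per_day(timestamps):
    """Count conference messages per calendar day from ISO-8601 timestamps."""
    return Counter(datetime.fromisoformat(ts).date().isoformat()
                   for ts in timestamps)

# Hypothetical posting log for one discussion forum
log = ["2003-04-01T09:15:00", "2003-04-01T21:40:00", "2003-04-02T08:05:00"]
print(messages_per_day(log))  # Counter({'2003-04-01': 2, '2003-04-02': 1})
```

The same grouping approach extends to messages/week, peak-usage hours, or session lengths by changing the key extracted from each timestamp.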

  15. Electronic Conferencing: Qualitative Analyses • General: Observation Logs, Reflective Interviews, Retrospective Analyses, Focus Groups • Specific: Semantic Trace Analyses, Talk/Dialogue Categories (content talk, questioning, peer feedback, social acknowledgments, off task) • Emergent: Forms of Learning Assistance, Levels of Questioning, Degree of Perspective Taking, Case Quality, Participant Categories

  16. Student Basic Quantitative • Grades, Achievement Test Scores, etc. • Number of Posts • Overall Participation • Computer Log Activity—peak usage, messages/day, time of task or in system • Attitude Surveys

  17. Student High-End Success • Message complexity, depth, interactivity, questioning • Collaboration skills • Problem finding/solving and critical thinking • Challenging and debating others • Case-based reasoning, critical thinking measures • Portfolios, performances, PBL activities

  18. Other Measures of Student Success(Focus groups, interviews, observations, surveys, exams, records) • Positive Feedback, Recommendations • Increased Comprehension, Achievement • High Retention in Program • Completion Rates or Course Attrition • Jobs Obtained, Internships • Enrollment Trends for Next Semester

  19. Findings: Learning Improved (Maki et al., 2000) • Intro to Psych: Lecture vs. Online • Online students performed better on midterms. • Web-based course students scored higher because they had weekly activities due. • Lecture students could put off reading until the night before the exam.

  20. Findings: Learning Improved (review by Chang, 2003) • Online students outperformed peers in a histology (anatomy: plant and animal tissues under microscope) course (Shoenfeld-Tacher et al., 2001) • Web enhancements raised exam performance, grades, and attitudes toward economics (Agarwal & Day, 1998) • Online business communications students performed better on final exams than on-campus students (Tucker, 2000)

  21. Integrating Wireless Content, Syllabus Magazine, May 13, 2003 • Study by Mobile Learning Corp. across a group of college institutions • Digital content helped first-year college accounting students learn • Online interactive exercises were useful to student learning • Encouraged independent student learning and encouraged instructors to adopt a coaching role.

  22. Findings: Learning Worse (Wang & Newlin, 2000) • Stat Methods: Lecture vs. Online • No differences at midterm • Lecture averaged 87 on the final; Web averaged 72 • Course was relatively unstructured • Web students were encouraged to collaborate • Lecture students could not collaborate • All exams but the final were open book

  23. Findings: Learning Worse… Organizational Behavior, IUSE (Keefe, Educause Quarterly, 1, 2003) • Keefe studied 4 semesters of courses, 6 sections, 118 students • Face-to-face students were more satisfied with the course and instructor • Enrollment in the online course was associated with lower grades

  24. Learning Improved or Not?(Sankaran et al., 2000) • Students with a positive attitude toward Web format learned more in Web course than in lecture course. • Students with positive attitude toward lecture format learned more in lecture format.

  25. Contrasting Findings are the Norm • Some courses impersonal, isolating, and frustrating (Hara & Kling, 2001) • Sense of community and lower attrition rates when support interactivity, reflection, and sharing (Harnishfeger, March, 2003)

  26. Problem-Based Learning, Distance Ed, 23(1), 2002 • Practical learning issues generated more interactions and higher levels of interaction than theoretical issues • Communities of learners need to negotiate identity and knowledge and need milestones (chat session agreements, producing reports, sharing stories, and new work patterns) • Group development: (1) negotiate problem and timetable, (2) divide work in subgroups, and (3) produce drafts of products

  27. Network Conferencing Interactivity (Rafaeli & Sudweeks, 1997) 1. More than 50 percent of messages were reactive. 2. Only around 10 percent were truly interactive. 3. Most messages were factual statements or opinions. 4. Many also contained questions or requests. 5. Frequent participants were more reactive than infrequent ones. 6. Interactive messages carried more opinions and humor. 7. More self-disclosure, involvement, and belonging. 8. Participants were attracted to fun, open, frank, helpful, supportive environments.

  28. Schallert & Reed, AERA, April 2003 • Nonnative students do not participate equally in written discussions • Enthusiastic and frequent contributors do not necessarily make intellectually significant contributions • Some who seem deeply engaged may in fact be less rigorously engaged across many conversations

  29. Collaborative Behaviors(Curtis & Lawson, 1997) • Most common were: (1) Planning, (2) Contributing, and (3) Seeking Input. • Other common events were: (4) Initiating activities, (5) Providing feedback, (6) Sharing knowledge • Few students challenge others or attempt to explain or elaborate • Recommend: using debates and modeling appropriate ways to challenge others

  30. Dimensions of Learning Process (Henri, 1992) 1. Participation (rate, timing, duration of messages) 2. Interactivity (explicit interaction, implicit interaction, and independent comment) 3. Social Events (statements unrelated to content) 4. Cognitive Events (e.g., clarifications, inferencing, judgment, and strategies) 5. Metacognitive Events: both metacognitive knowledge (person, task, and strategy) as well as metacognitive skill (evaluation, planning, regulation, and self-awareness)

  31. Surface vs. Deep Posts (Henri, 1992)
     Surface Processing: making judgments without justification; stating that one shares ideas or opinions already stated; repeating what has been said; asking irrelevant questions. I.e., fragmented, narrow, and somewhat trite.
     In-depth Processing: linked facts and ideas; offered new elements of information; discussed advantages and disadvantages of a situation; made judgments that were supported by examples and/or justification. I.e., more integrated, weighty, and refreshing.

  32. Critical Thinking (Newman, Johnson, Webb & Cochrane, 1997) Used Garrison's five-stage critical thinking model • Critical thinking occurred in both CMC and FTF environments • Depth of critical thinking was higher in CMC environments • More likely to bring in outside information • Link ideas and offer interpretations • Generate important ideas and solutions • FTF settings were better for generating new ideas and creatively exploring problems.

  33. Social Construction of Knowledge(Gunawardena, Lowe, & Anderson, 1997) • Five Stage Model 1. Share ideas, 2. Discovery of Idea Inconsistencies, 3. Negotiate Meaning/Areas Agree, 4. Test and Modify, 5. Phrase Agreements • In global debate, very task driven. • Dialogue remained at Phase I: sharing info

  34. Research on Instructors Online • If teacher-centered, students explore less, engage less, interact less (Peck & Laycock, 1992) • Informal, exploratory conversation fosters risk-taking and knowledge sharing (Weedman, 1999) • The online teaching job varies: planning, interaction, administration, teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999)

  35. Three Most Vital Online Teaching Skills, The Online Teacher, TAFE, Guy Kemshal-Bell (April, 2001) • Ability to engage the learner (30) • Ability to motivate online learners (23) • Ability to build relationships (19) • Technical ability (18) • Having a positive attitude (14) • Adapting to individual needs (12) • Innovation or creativity (11)

  36. Feelings Toward Online Teaching, The Online Teacher, TAFE, Guy Kemshal-Bell (April, 2001) (Note: 94 practitioners surveyed.) • Exciting (30) • Challenging (24) • Time consuming (22) • Demanding (18) • Technical issues (16); Flexibility (16) • Potential (15) • Better options (14); Frustrating (14) • Collaboration (11); Communication (11); Fun (11)

  37. Dennen's Research on Nine Online Courses (sociology, history, communications, writing, library science, technology, counseling)
     Poor Instructors: Little or no feedback given • Always authoritative • Kept narrow focus of what was relevant • Created tangential discussions • Only used "ultimate" deadlines
     Good Instructors: Provided regular qualitative/quantitative feedback • Participated as peer • Allowed perspective sharing • Tied discussion to grades and other assessments • Used incremental deadlines

  38. Role of Online Teacher (Bonk, Kirkley, Hara, & Dennen, 2001) • Technical: train, early tasks, be flexible, orientation task • Managerial: initial meeting, FAQs, detailed syllabus, calendar, post administrivia, assign e-mail pals, gradebooks, e-mail updates • Pedagogical: peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry • Social: café, humor, interactivity, profiles, foreign guests, digital pics, conversations, guests

  39. Problems and Solutions (Bonk, Wisher, & Lee, in press)
     • Tasks overwhelm → Train and be clear
     • Confused on the Web → Structure times/dates due
     • Too nice due to limited shared history → Develop roles and controversies
     • Lack of justification → Train to back up claims
     • Hard not to preach → Let students take the lead role
     • Too much data → Use e-mail pals
     • Communities not easy to form → Embed informal/social activities

  40. Benefits and Implications (Bonk, Wisher, & Lee, in press)
     • Shy students open up online → Use asynchronous conferencing
     • Minimal off-task behavior → Create social tasks
     • Delayed collaboration richer than real time → Use async for debates; sync for help, office hours
     • Students can generate lots of information → Structure generation and force reflection/comment
     • Minimal disruptions → Foster debates/critique
     • Extensive e-advice → Find experts or practitioners
     • Excited to publish → Ask permission

  41. More Implications • Include Variety: tasks, topics, participants, accomplishments, etc. • Make interaction extend beyond class • Have learners be teachers • Find multiple ways to succeed • Add personalization and choice • Provide clarity and easy navigation

  42. Ten Ways Online Ed Matches or Surpasses FTF, Mark Kassop, Technology Source, Michigan Virtual Univ, May/June 2003 • Student-centered learning • Writing intensity • Highly interactive discussions • Geared for lifelong learning • Enriched course materials • On-demand interaction and support • Immediate feedback • Flexibility • An intimate community of learners • Faculty development and rejuvenation

  43. 2. Research and Evaluation in Corporate Settings

  44. Collecting Evaluation Data • Learner Reaction • Learner Achievement • Learner Job Performance • Manager Reaction • Productivity Benchmarks

  45. Forms of Evaluation • Interviews and Focus Groups • Self-Analysis • Supervisor Ratings • Surveys and Questionnaires • ROI • Document Analysis • Data Mining (Changes in pre and post-training; e.g., sales, productivity)
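ROI in the list above is conventionally net benefit divided by cost. A minimal sketch, with hypothetical figures rather than numbers from any study cited here:

```python
def roi_percent(benefits, costs):
    """Return on investment as a percentage: (benefits - costs) / costs * 100."""
    return (benefits - costs) / costs * 100

# Hypothetical: an e-learning program costs $100,000; productivity gains
# measured via pre/post-training benchmarks are valued at $150,000
print(roi_percent(150_000, 100_000))  # 50.0
```

The hard evaluation work is not the arithmetic but defensibly converting training outcomes (sales, productivity, retention) into the benefits figure.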

  46. What is Evaluation??? “Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders.” (e.g., does it work as effectively as the standard instructional approach). (Champagne & Wisher, in press)

  47. Meta-Analysis: Recurrent Themes in E-Learning Reports (Waight, Willging, & Wentling, 2002) • 250 e-learning reports from 1999-2001 • Of those, 100 were sold by private companies for $100-3,000 • Of the remaining 150, 70 were from outside the U.S. • The 15 selected were from government, business, and professional associations • Few studies reviewed existing research

  48. Meta-Analysis: Six Functions of E-Learning (Waight, Willging, & Wentling, 2002) • Anytime, anywhere • Cost effective • Global reach • Just-in-time • Allow personalization • Improve collaboration and interactivity • Address learner diversity, support learner-centered approaches, and blur the lines between working and learning
