
Using Learner Analytics to Understand Student Achievement in a Large Enrollment Hybrid Course



Presentation Transcript


  1. Using Learner Analytics to Understand Student Achievement in a Large Enrollment Hybrid Course John Whitmer, Ed.D. Updated: February 19, 2013

  2. Outline • Context • Methods & Tools • Findings • Conclusions & Next Steps

  3. 1. Context

  4. Case Study: Intro to Religious Studies • Undergraduate, introductory, high demand • Redesigned to a hybrid delivery format through the “Academy eLearning program” • Enrollment: 373 students (54% increase over the largest prior section) • Highest LMS (Vista) usage on the entire campus in Fall 2010 (>250k hits) • Bimodal outcomes: • 10% increase on the final exam • 7% & 11% increase in DWF (54 F’s) • Why? Can’t tell with aggregated data

  5. Founded in 1887 • 15,257 FTES, 95% from California, serves 12 counties • Primarily residential, undergraduate teaching college • Campus in the California State University system (23 campuses, 44,000 faculty and staff, 437,000 students)

  6. CSU Budget: Proposed Increase! Source: CSU Chancellor’s Office, http://bit.ly/X7LYeK

  7. Driving Conceptual Questions • How is student LMS use related to academic achievement in a single course section? • How does that finding compare to the relationship of achievement with traditional student characteristic variables? • How are these relationships different for “at-risk” students (URM & Pell-eligible)? • What data sources, variables and methods are most useful to answer these questions?

  8. 2. Methods & Tools

  9. Methods at a Glance • Data sources: 1) LMS logfiles, 2) SIS data, 3) Course data • Process • Clean/filter/transform/reduce data (70% of effort) • Descriptive / exploratory analysis (20% of effort) • Statistical analysis (10% of effort) • Factor analysis • Correlation (single variables) • Regression (multiple variables; partial & complete models)
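As a concrete illustration of the process above, here is a minimal pandas sketch of the merge-and-reduce step. The file names and column names (student_id, tool, final_grade, hs_gpa, pell_eligible, urm) are assumptions for illustration, not the study's actual schema.

```python
import pandas as pd

# 1) LMS logfiles: one row per hit/event
lms = pd.read_csv("lms_logfile.csv")       # assumed columns: student_id, timestamp, tool, action

# 2) SIS data: one row per student
sis = pd.read_csv("sis_extract.csv")       # assumed columns: student_id, hs_gpa, pell_eligible, urm

# 3) Course data: one row per student
course = pd.read_csv("course_grades.csv")  # assumed columns: student_id, final_grade

# Reduce logfile events to per-student hit counts by tool category.
hits = (lms.groupby(["student_id", "tool"])
           .size()
           .unstack(fill_value=0)
           .add_suffix("_hits")
           .reset_index())

# Join everything into one analysis table; students with no recorded
# hits get zeros rather than missing values.
analysis = (course.merge(sis, on="student_id", how="left")
                  .merge(hits, on="student_id", how="left"))
hit_cols = [c for c in analysis.columns if c.endswith("_hits")]
analysis[hit_cols] = analysis[hit_cols].fillna(0)
print(analysis.head())
```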

  10. Tools Used

  11. Variables

  12. Missing Data On Critical Indicators

  13. Final data set: 72,000 records (73% reduction from the raw data)
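The slides do not list the specific filters, so the rules in this sketch (restricting to enrolled students with the critical indicators, dropping automated events) are illustrative assumptions; only the before/after count mirrors the reported 73% reduction.

```python
import pandas as pd

lms = pd.read_csv("lms_logfile.csv")       # hypothetical raw LMS export
roster = pd.read_csv("course_grades.csv")  # students with a recorded final grade

before = len(lms)

# Keep only hits from enrolled students who have the critical indicators
# (the missing-data screen from slide 12).
lms = lms[lms["student_id"].isin(roster["student_id"])]

# Drop automated or non-learning events (action names are illustrative).
lms = lms[~lms["action"].isin(["login_check", "session_timeout"])]

after = len(lms)
print(f"{after:,} records kept ({(after - before) / before:+.0%} change)")
```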

  14. LMS Use Consistent Across Categories: Factor Analysis of LMS Use Categories
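A sketch of how such a factor analysis could be run with scikit-learn; the category column names and the single-factor choice are assumptions, not the study's actual specification or tooling.

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

analysis = pd.read_csv("analysis_table.csv")   # hypothetical merged table from the earlier sketch
use_cols = ["content_hits", "assessment_hits", "engagement_hits", "admin_hits"]

# Standardize the hit counts, then fit a single-factor model.
X = StandardScaler().fit_transform(analysis[use_cols])
fa = FactorAnalysis(n_components=1, random_state=0).fit(X)

# Similar loadings across categories would indicate that LMS use is
# consistent across categories, as the slide title states.
loadings = pd.Series(fa.components_[0], index=use_cols)
print(loadings)
```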

  15. 3. Findings

  16. Clear Trend: Grade w/Mean LMS Hits

  17. Question 1 Results: Correlation of LMS Use with Final Grade (Scatterplot of Assessment Activity Hits vs. Course Grade)
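A minimal sketch of the correlation step, assuming the hypothetical column names from the earlier sketches; the study's exact variable definitions may differ.

```python
import pandas as pd
from scipy.stats import pearsonr

analysis = pd.read_csv("analysis_table.csv")   # hypothetical merged table

# Correlate each LMS-use count with the final course grade.
for col in ["content_hits", "assessment_hits", "engagement_hits", "admin_hits"]:
    r, p = pearsonr(analysis[col], analysis["final_grade"])
    print(f"{col:18s} r = {r:.2f}  p = {p:.3f}")
```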

  18. Question 2 Results: Correlation of Student Characteristics with Final Grade (Scatterplot of HS GPA vs. Course Grade)

  19. Question 2 Results: Correlation of Student Characteristics with Final Grade

  20. Conclusion: LMS Use Variables Better Predictors than Student Characteristics • LMS use variables: 18% average explanation of change in final grade (r = 0.35–0.48) • Student characteristic variables: 4% average explanation of change in final grade (r = -0.11–0.31)

  21. Smallest LMS use variable (Administrative Activities): r = 0.35 > largest student characteristic variable (HS GPA): r = 0.31

  22. Combined Variables Regression: Final Grade by LMS Use & Student Characteristic Variables • LMS use variables alone: 25% explanation of change in final grade (r² = 0.25) • Adding student characteristic variables: +10% (combined r² = 0.35)
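A sketch of the combined (hierarchical) regression, again with assumed column names and 0/1 coding for the categorical student variables; the slide's reported figures were r² = 0.25 for LMS use alone and r² = 0.35 for the combined model.

```python
import pandas as pd
import statsmodels.api as sm

analysis = pd.read_csv("analysis_table.csv")   # hypothetical merged table
y = analysis["final_grade"]

lms_block = ["content_hits", "assessment_hits", "engagement_hits", "admin_hits"]
student_block = ["hs_gpa", "pell_eligible", "urm"]   # assumed numeric / 0-1 coded

# Model 1: LMS-use variables only; Model 2: add student characteristics.
m1 = sm.OLS(y, sm.add_constant(analysis[lms_block])).fit()
m2 = sm.OLS(y, sm.add_constant(analysis[lms_block + student_block])).fit()

print(f"LMS use only:        R^2 = {m1.rsquared:.2f}")   # ~0.25 on the slide
print(f"+ student variables: R^2 = {m2.rsquared:.2f}")   # ~0.35 on the slide
```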

  23. Question 3 Results: Regression by “At-Risk” Population Subsamples

  24. At-Risk Students: “Over-Working Gap”

  25. Activities by Pell Status and Grade: extra effort in content-related activities
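A descriptive sketch of the comparison behind the over-working gap, assuming a 0–4 grade scale and the hypothetical columns used in the sketches above.

```python
import pandas as pd

analysis = pd.read_csv("analysis_table.csv")   # hypothetical merged table

# Bin final grades (assumed to be on a 0-4 scale) into letter-grade bands.
analysis["grade_band"] = pd.cut(analysis["final_grade"],
                                bins=[0, 1, 2, 3, 4],
                                labels=["D/F", "C", "B", "A"],
                                include_lowest=True)

# Mean content-related hits by Pell eligibility and grade band; comparing
# rows shows whether Pell-eligible students put in extra effort at each band.
table = (analysis.groupby(["pell_eligible", "grade_band"], observed=True)
                 ["content_hits"].mean()
                 .unstack())
print(table.round(1))
```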

  26. Previous Studies Relating LMS Use to Course Grade

  27. 4. Conclusions & Next Steps

  28. Conclusions • At the course level, LMS use is a better predictor of academic achievement than student demographics (what students do, not who they are). • The modest strength of the complete model demonstrates the relevance of the data, but suggests that better methods could produce stronger results. • LMS data requires extensive filtering to be useful; student variables need pre-screening for missing data.

  29. More Conclusions • LMS use frequency is a proxy for effort, not a very complex indicator. • Student demographic measures need revision for utility in the postmodern era (importance to the student, more frequent sampling, etc.). • The lower LMS effectiveness for at-risk students may be caused by non-technical barriers. Additional research is needed!

  30. Ideas & Feedback • Potential for improved LMS analysis methods: social learning, activity patterns, discourse content analysis, time series analysis • Group students by broader identity, with unique variables: continuing students (current college GPA, URM, etc.); first-time freshmen (HS GPA, SAT/ACT, etc.)

  31. Feedback? Questions? John Whitmer, jwhitmer@calstate.edu • Slides: http://slidesha.re/15iokzE • Complete monograph: http://bit.ly/15ijySP • Twitter: johncwhitmer
