
Intelligent Learning Object Guide (iLOG)

A Framework for Automatic Empirically-Based Metadata Generation



  1. Intelligent Learning Object Guide (iLOG)
A Framework for Automatic Empirically-Based Metadata Generation
S.A. Riley (a), L.D. Miller (a), L.-K. Soh (a), A. Samal (a), and G. Nugent (b)
(a) University of Nebraska—Lincoln: Department of Computer Science and Engineering
(b) University of Nebraska—Lincoln: Center for Research on Children, Youth, Families and Schools

  2. Overview
• Introduction:
  • What is a Learning Object (LO)?
  • Why do we need LO metadata?
  • Metadata problems and iLOG solution
• iLOG Framework:
  • LO Wrapper
  • MetaGen (metadata generator): Data Logging, Data Extraction, Data Analysis (feature selection, rule mining, statistics)
• Conclusions and Future Work

  3. Introduction: What is a learning object?
• Self-contained learning content
• Ideally, each covers a single topic
• Serves as a building block for lessons, modules, or courses
• Can be reused in multiple instructional contexts
• Learning Object structure: content (tutorial, exercises, assessment) plus metadata
(Figure: a Learning Object with its attached LO Metadata)

  4. Introduction: What is a learning object?
• The iLOG LOs contain a tutorial, exercises, and an assessment
• Each covers a ‘bite-sized’ introductory computer science topic
(Figure: sample tutorial, exercise, and assessment screens)

  5. Introduction: Why do we need LO metadata?
• Repositories for LOs are being constructed
• However, there are barriers to effective utilization of these repositories:
  • Learning Context: not all LOs, even on the same topic, are suitable for use in a given learning context
  • Uncertainty: we cannot be certain what will happen with real-world usage
  • Search and Retrieval: current metadata is not machine-readable, and thus cannot support automated search for LOs
(Figure: an LO Repository of Learning Objects, each with its own LO Metadata)

  6. Introduction: Why do we need LO metadata?
Learning Context:
• Students are highly varied: pre-existing knowledge, cultural background, motivation, self-efficacy, etc.
Uncertainty:
• Cannot be certain what will happen when actual students use an actual LO:
  • Good for students with low self-efficacy
  • Inherent gender bias
  • Bad for students without Calculus experience
Search and Retrieval:
• Metadata is fundamental to an instructor’s ability to use LOs:
  • Guides the LO selection process
  • Helps prevent the feeling that e-learning is ‘too complicated’

  7. So… how do we enable instructors to locate appropriate LOs for their students?

  8. Introduction: Metadata problems and iLOG solution

Current Metadata                                | Ideal Metadata
Manual generation by course designer            | Automated generation
Based only on designer intuition                | Based on empirical usage
Metadata format inconsistent / incomplete       | Consistent metadata suitable for guiding LO selection
Human- but not machine-readable                 | Both human- and machine-readable

• Current metadata standards are insufficient (Friesen, 2008)
• There are ample opportunities for making e-learning more “intelligent” (Brooks et al., 2006)

  9. Introduction: Metadata problems and iLOG solution
The iLOG solution is:
• General: iLOG is based on established learning standards
  • We use the SCORM learning object standard, the IEEE LOM metadata standard, and the Blackboard LMS
  • Furthermore, it is compatible with existing LOs and does not require modification to the LOs (noninvasive)
  • The iLOG framework can also be applied to other standards
• Automatic: iLOG metadata is automatically generated and updated
• Interpretable: iLOG metadata is both human- and machine-readable

  10. Introduction: Metadata problems and iLOG solution
• LO Wrapper: logs student behaviors when using an LO
• MetaGen: generates empirical usage metadata using data mining techniques
• Works noninvasively with pre-existing LOs using standard learning management systems (LMSs)
(Figure: wrapped Learning Objects and their LO Metadata inside an LMS)

  11. Related Work
• Automatic metadata generation:
  • Primarily focuses on content taxonomies (Roy et al., 2008; Jovanovic et al., 2006)
• Mining student behavior log files:
  • Mining has been shown to have a positive impact on instruction and learning (Kobsa et al., 2007)
• Standardization of educational log file data:
  • Significant progress has been made with the tutor-message format standard (PSLC DataShop)

  12. Overview
• Introduction:
  • What is a Learning Object (LO)?
  • Why do we need LO metadata?
  • Metadata problems and iLOG solution
• iLOG Framework:
  • LO Wrapper
  • MetaGen (metadata generator): Data Logging, Data Extraction, Data Analysis (feature selection, rule mining, statistics)
• Conclusions and Future Work

  13. iLOG Framework
• Two components: LO Wrapper and MetaGen
(Figure: framework pipeline — the LO Wrapper logs interactions to a database; MetaGen runs Data Logging, Data Extraction, and Data Analysis (Feature Selection → Feature Subset → Rule Mining; Statistics Generation) over log files and existing metadata to produce Rules and Statistics as LO Metadata)

  14. iLOG Framework: LO Wrapper
LO Wrapper:
• ‘Wraps’ around an existing LO
• Intercepts student interactions and logs them to a database
• Does not require changing the LO
(Figure: the LO Wrapper enclosing a Learning Object and its LO Metadata)
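The interception-and-log idea can be sketched as follows. This is a minimal illustration in Python (the actual wrapper is an HTML/JavaScript document with a PHP/MySQL backend, per slide 27); the table and column names are assumptions, not the deployed schema.

```python
import json
import sqlite3
import time

# Hypothetical schema: one row per timestamped student interaction,
# mirroring what the wrapper intercepts (page navigation, clicks, etc.).
def make_db(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS interactions (
                      student_id TEXT, lo_id TEXT, event TEXT,
                      detail TEXT, timestamp REAL)""")
    return db

def log_interaction(db, student_id, lo_id, event, detail=None, ts=None):
    # Record one intercepted interaction; the LO itself is never modified.
    db.execute("INSERT INTO interactions VALUES (?, ?, ?, ?, ?)",
               (student_id, lo_id, event, json.dumps(detail),
                ts if ts is not None else time.time()))
    db.commit()

db = make_db()
log_interaction(db, "s42", "lo_recursion", "page_view", {"page": 3})
```

The key design point survives even in this toy form: logging is a side channel around the LO, so the content package needs no changes.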

  15. iLOG Framework: MetaGen
• MetaGen modules: Data Logging, Data Extraction, Data Analysis
(Figure: the MetaGen pipeline from database to Rules and Statistics)

  16. iLOG Framework: MetaGen—Logging
• Potential data sources:
  • Interactions: clicks, time spent, etc.
  • Surveys: demographic, motivation, self-efficacy, evaluation
  • Assessment scores
(Figure: the Data Logging module receiving log files from the LO Wrapper)

  17. iLOG Framework: MetaGen—Logging
• Data sources used in our iLOG deployment:
(Table: data sources used in the deployment)

  18. iLOG Framework: MetaGen—Extraction
Data Extraction:
• Uses a Java application to query the relational database and extract a ‘flat’ dataset suitable for data mining:
  • Student behaviors: average time per tutorial page, total time on assessment, etc.
  • Student characteristics: total motivation self-rating, GPA, gender, etc.
(Figure: Data Extraction producing the iLOG dataset from log files and existing metadata)
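The flattening step can be illustrated with a small sketch: aggregate the relational interaction log into one feature row per student. This is a Python stand-in for the Java extraction module; the table layout and feature names are assumptions for illustration only.

```python
import sqlite3

# Toy interaction log (student_id, event, duration in seconds), standing in
# for the relational database the extraction module actually queries.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE interactions (student_id TEXT, event TEXT, duration REAL)")
db.executemany("INSERT INTO interactions VALUES (?, ?, ?)", [
    ("s1", "page_view", 30.0), ("s1", "page_view", 50.0), ("s1", "assessment", 300.0),
    ("s2", "page_view", 20.0), ("s2", "assessment", 450.0),
])

def extract_flat_dataset(db):
    # One 'flat' feature row per student, ready for data mining.
    rows = db.execute("""
        SELECT student_id,
               AVG(CASE WHEN event = 'page_view' THEN duration END),
               SUM(CASE WHEN event = 'assessment' THEN duration END)
        FROM interactions
        GROUP BY student_id""").fetchall()
    return {s: {"avg_page_time": a, "total_assess_time": t} for s, a, t in rows}

dataset = extract_flat_dataset(db)
# dataset["s1"]["avg_page_time"] -> 40.0
```

Behavior features (times) would be joined with characteristic features (survey answers, GPA, gender) in the same way to form the full iLOG dataset.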

  19. iLOG Framework: MetaGen—Analysis
Data Analysis (feature selection):
• Uses an ensemble of feature selection algorithms
• Seeks to identify student behaviors and characteristics that are relevant to learning outcomes
(Figure: Feature Selection producing the Feature Subset within Data Analysis)

  20. iLOG Framework: MetaGen—Analysis
• Feature selection (FS) is used to find a subset of variables (features) that is sufficient to describe a dataset (Guyon et al., 2003)
• Different techniques may generate different results, and no single technique is guaranteed to find every relevant feature
• Our goal, instead, was to find ALL features relevant to learning outcomes
• Thus, the feature selection ensemble members ‘vote’ on which features they identify as most relevant
(Figure: ensemble members FS#1, FS#2, FS#3 selecting overlapping subsets of all features)
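The voting step above can be sketched directly. This is an illustrative stand-in: the real ensemble members are Weka feature-selection algorithms, whereas here each member is just a list of the feature names it judged relevant, and the threshold is a simple majority.

```python
from collections import Counter

def vote_on_features(selections, threshold=None):
    """selections: one list of selected feature names per ensemble member.
    Features named by at least `threshold` members are kept."""
    if threshold is None:
        threshold = len(selections) // 2 + 1  # simple majority
    votes = Counter(f for sel in selections for f in set(sel))
    return sorted(f for f, n in votes.items() if n >= threshold)

# Hypothetical output of three feature selectors on one LO's dataset:
fs1 = ["gender", "taken_calculus", "motivation"]
fs2 = ["taken_calculus", "motivation", "gpa"]
fs3 = ["motivation", "gender"]
selected = vote_on_features([fs1, fs2, fs3])
# selected -> ['gender', 'motivation', 'taken_calculus']
```

Voting trades the bias of any single selector for the union of what most selectors agree on, which matches the stated goal of finding all relevant features rather than one minimal subset.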

  21. iLOG Framework: MetaGen—Analysis
Notable Results:
• Relevant features varied widely across LOs
• Discovered unexpected patterns: possible gender bias, Calculus bias, etc.

  22. iLOG Framework: MetaGen—Analysis
Rule Mining:
• Uses the Tertius algorithm for predictive rule mining
• Generates rules from the selected features (along with rule strength):
  takenCalculus? = no → fail (.52)
  currentTotalMotivationAboveAvg? = no → fail (.52)
  gender = female → fail (.36)
(Figure: Rule Mining consuming the Feature Subset within Data Analysis)
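To make the "antecedent → fail (strength)" rule shape concrete, here is a simplified sketch. Tertius ranks rules by a confirmation measure (Flach, 2001); as a stand-in, this toy scorer uses plain rule confidence, P(fail | antecedent), on hypothetical student rows — so the numbers are not Tertius scores.

```python
def rule_confidence(rows, antecedent, outcome="fail"):
    """Fraction of rows matching the antecedent that have the given outcome."""
    matching = [r for r in rows if all(r[k] == v for k, v in antecedent.items())]
    if not matching:
        return 0.0
    return sum(1 for r in matching if r["result"] == outcome) / len(matching)

# Hypothetical flat dataset rows (selected features + pass/fail outcome):
students = [
    {"taken_calculus": "no",  "gender": "female", "result": "fail"},
    {"taken_calculus": "no",  "gender": "male",   "result": "pass"},
    {"taken_calculus": "yes", "gender": "female", "result": "pass"},
    {"taken_calculus": "no",  "gender": "female", "result": "fail"},
]
score = rule_confidence(students, {"taken_calculus": "no"})
# "taken_calculus = no -> fail" scores 2/3 on this toy data
```

A mined rule is thus just an antecedent over the selected features plus a strength computed from the empirical data, which is what gets written into the usage metadata.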

  23. iLOG Framework: MetaGen—Analysis
Statistics Generation:
• Empirical data: time to complete, pass/fail rates, and student ratings of the LO:
  successRate = 51%
  averageTime = 433 seconds
  averageStudentRating = 4.3/5.0
(Figure: Statistics Generation within Data Analysis)
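The statistics step is simple aggregation over per-student records. A minimal sketch, assuming a hypothetical record layout (the field names are illustrative, not the deployed schema):

```python
def usage_statistics(records):
    """Aggregate per-student usage records into the slide's three statistics."""
    n = len(records)
    return {
        "successRate": sum(1 for r in records if r["passed"]) / n,
        "averageTime": sum(r["seconds"] for r in records) / n,
        "averageStudentRating": sum(r["rating"] for r in records) / n,
    }

# Two hypothetical students' results on one LO:
records = [
    {"passed": True,  "seconds": 400, "rating": 4.5},
    {"passed": False, "seconds": 466, "rating": 4.1},
]
stats = usage_statistics(records)
# stats["averageTime"] -> 433.0
```

Unlike the mined rules, these statistics carry no antecedents; they summarize the whole usage population and are attached to the metadata alongside the rules.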

  24. iLOG Framework: MetaGen—Analysis
• There appear to be different predictors of success for different learning contexts:
  • Honors: student impression of the LO, gender
  • Majors: motivation, math experience
  • Non-majors: long time spent on assessment, math experience, gender

  25. iLOG Framework: MetaGen—Analysis
• Inverse relationship between time spent on an LO and student ratings:
  • Advanced students may have higher expectations (lower ratings)
  • Advanced students may care more about the material (more time spent)

  26. iLOG Framework: MetaGen—Analysis
Rules and Statistics:
• Usage statistics and rules are combined to form the empirical usage metadata
(Figure: Rules and Statistics emitted by Data Analysis as LO Metadata)

  27. iLOG Framework: Our Implementation
LO Wrapper:
• An HTML document that uses JavaScript to record and timestamp student interactions with the LO (e.g., page navigation, clicks on a page)
• Uses a modification of the Easy SCO Adapter (1) to interface with the SCORM API and retrieve student assessment results from the LMS
• Uses JavaScript to transmit interaction data to MetaGen
MetaGen:
• Data logging: uses PHP to store student interaction data in a MySQL database
• Data extraction: uses Java to query the database and process the data into the iLOG dataset
• Data analysis: uses the Weka (Witten, 2005) implementations of several feature selection algorithms to generate the iLOG data subset, and the Tertius (Flach, 2001) predictive rule mining algorithm to generate empirical usage metadata rules
(1) http://www.ostyn.com/standards/demos/SCORM/wraps/easyscoadapterdoc.htm#license

  28. Overview
• Introduction:
  • What is a Learning Object (LO)?
  • Why do we need LO metadata?
  • Metadata problems and iLOG solution
• iLOG Framework:
  • LO Wrapper
  • MetaGen (metadata generator): Data Logging, Data Extraction, Data Analysis (feature selection, rule mining, statistics)
• Conclusions and Future Work

  29. Conclusions
iLOG: a framework for automatic, empirical metadata generation:
• LO Wrapper component:
  • “Wraps” noninvasively around pre-existing learning objects (LOs)
  • Automatically collects and logs student interaction data
  • Resulting LOs can be played on a standard LMS, such as Blackboard
• MetaGen component (metadata generator):
  • Uses data mining to create empirical usage metadata:
    • Feature selection: provides insights on which student characteristics and behaviors may contribute to success in different learning contexts
    • Rule mining: uses salient features to generate rules predicting success
    • Usage statistics: empirical evidence of time to complete, scores, etc.
• iLOG’s empirical usage metadata should enable instructors to locate LOs that are appropriate to their students’ learning context

  30. Future Work: Closing the Loop
• Method to automatically write empirical usage metadata to the LO metadata file
• Method to integrate new metadata with existing metadata
(Figure: the full iLOG pipeline, with generated metadata written back to the LO)

  31. References
• IEEE 1484.12.1-2002 Standard for Learning Object Metadata (LOM). Retrieved January 7, 2009, from http://ltsc.ieee.org/wg12/files/LOM_1484_12_1_v1_Final_Draft.pdf
• N. Friesen, The International Learning Object Metadata Survey. Retrieved August 7, 2008, from http://www.irrodl.org/index.php/irrodl/article/view/195/277/
• C. Brooks, J. Greer, E. Melis, C. Ullrich, Combining ITS and eLearning Technologies: Opportunities and Challenges, Proc. 8th Int. Conf. on Intelligent Tutoring Systems (2006), 278-287.
• D. Roy, S. Sarkar, S. Ghose, Automatic Extraction of Pedagogic Metadata from Learning Content, Int. J. of Artificial Intelligence in Education 18 (2008), 287-314.
• J. Jovanovic, D. Gasevic, V. Devedzic, Ontology-Based Automatic Annotation of Learning Content, Int. J. on Semantic Web and Information Systems 2(2) (2006), 91-119.
• B. Jong, T. Chan, Y. Wu, Learning Log Explorer in E-Learning Diagnosis, IEEE Transactions on Education 50(3) (2007), 216-228.
• E. Garcia, C. Romero, S. Ventura, C. Castro, An Architecture for Making Recommendations to Courseware Authors Using Association Rule Mining and Collaborative Filtering, User Modeling and User-Adapted Interaction (to appear).
• E. Kobsa, V. Dimitrova, R. Boyle, Adaptive Feedback Generation to Support Teachers in Web-Based Distance Education, User Modeling and User-Adapted Interaction 17 (2007), 379-413.
• I. Guyon, A. Elisseeff, An Introduction to Variable and Feature Selection, Journal of Machine Learning Research 3 (2003), 1157-1182.
• P.A. Flach, N. Lachiche, Confirmation-Guided Discovery of First-Order Rules with Tertius, Machine Learning 42 (2001), 61-95.
• I.H. Witten, E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, 2nd ed., Morgan Kaufmann, San Francisco, 2005.

  32. Contact and Acknowledgement
iLOG project website: http://cse.unl.edu/agents/ilog
Authors: S.A. Riley, L.D. Miller, L.-K. Soh, A. Samal, and G. Nugent
Email: sriley@cse.unl.edu, lmille@cse.unl.edu, lksoh@cse.unl.edu, samal@cse.unl.edu, gnugent1@unl.edu
• This material is based upon work supported by the National Science Foundation under Grant No. 0632642 and an NSF GAANN fellowship.
