
Using Bayesian Networks to Predict Test Scores


Presentation Transcript


  1. Using Bayesian Networks to Predict Test Scores • by Zach Pardos • Neil Heffernan, Advisor

  2. Introduction Overview • ASSISTment tutoring system • The Task • Bayesian networks • Platform selection

  3. ASSISTment Tutoring System • Online tutoring system developed at WPI to assess student knowledge and learning • Assists and prepares students for the MCAS • In its 2nd year of operation • Participation includes over 2,000 students, with 20 teachers/classes, at 6 schools

  4. ASSISTment Tutoring System • Students attempt to answer top-level questions based on previous MCAS test questions • If a student answers incorrectly or asks for a hint, they are given supporting questions (called scaffolds) or hint text messages • All answers and actions are logged on the server

  5. The Task • Use Bayesian networks to assess students’ knowledge levels in the ASSISTment system and predict their performance on the MCAS test • Research topic: compare the predictive performance of fine-grained vs. coarse-grained skill models

  6. Bayesian Networks • "The essence of the Bayesian approach is to provide a mathematical rule explaining how you should change your existing beliefs in the light of new evidence. In other words, it allows scientists to combine new data with their existing knowledge or expertise." - The Economist (9/30/00)

  7. Bayesian Networks • “New data” • 2,000 students answering questions online • MCAS test results • “Existing knowledge or expertise” • Skill models of various grain sizes • Prof. Neil Heffernan • Bayes’ rule: P(R = r | e) = P(e | R = r) P(R = r) / P(e), where R is a random variable with value r and e is the evidence
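
For concreteness, a small MATLAB sketch of this update applied to a single skill after one observed response; the prior, guess, and slip values are illustrative placeholders, not figures from the project.

```matlab
% Illustrative numbers only (not taken from the slides): update the belief
% that a student knows a skill after observing one correct answer.
prior = 0.50;   % P(skill known) before seeing the response
guess = 0.25;   % P(correct | skill not known)
slip  = 0.10;   % P(incorrect | skill known)

p_correct = (1 - slip) * prior + guess * (1 - prior);   % P(e)
posterior = (1 - slip) * prior / p_correct;             % P(R = known | e = correct)
fprintf('P(skill known | correct answer) = %.3f\n', posterior);
```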

  8. Platform Selection • Bayesian network software choices: • GeNIe • MSBNx • BayesiaLab • Netica • MATLAB with BNT (Bayes Net Toolkit) • Java Bayes

  9. Platform Selection • Choice: MATLAB with BNT • Pros: • Provides a wide selection of inference engines • MATLAB’s robust programming environment • Automation • Runs on GNU/Linux • Existing Perl interface for the many scripts that perform data-mining tasks • Cons: • Somewhat slow
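
A minimal sketch of how such a model is built and queried with BNT, assuming its standard mk_bnet / tabular_CPD / jtree_inf_engine calls; the two-node layout and parameter values are illustrative only, not the project's actual networks.

```matlab
% Minimal BNT sketch: one binary skill node with one binary question node
% as its child (state 1 = unknown/incorrect, state 2 = known/correct).
N = 2; skill = 1; question = 2;
dag = zeros(N, N); dag(skill, question) = 1;      % skill -> question
node_sizes = 2 * ones(1, N);                      % all nodes binary
bnet = mk_bnet(dag, node_sizes, 'discrete', 1:N);

prior = 0.5; guess = 0.25; slip = 0.10;           % best-guess starting values
bnet.CPD{skill} = tabular_CPD(bnet, skill, 'CPT', [1-prior prior]);
% CPT entry order: (unknown,wrong) (known,wrong) (unknown,right) (known,right)
bnet.CPD{question} = tabular_CPD(bnet, question, 'CPT', [1-guess slip guess 1-slip]);

engine = jtree_inf_engine(bnet);                  % exact inference
evidence = cell(1, N); evidence{question} = 2;    % observed a correct answer
engine = enter_evidence(engine, evidence);
m = marginal_nodes(engine, skill);
fprintf('P(skill known | correct) = %.3f\n', m.T(2));
```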

  10. Project Overview • The datasets • Skill models • Parameters • Implementation • Results

  11. The Datasets • Student online response data • 600 students from 2004-2005 • Student selection criteria: • Completed at least 100 items online • Completed the 2005 MCAS test • 2,568 question items • Student state MCAS test scores for ’05 • Used for calculating prediction accuracy • No test data used for training/parameter learning

  12. Skill Models • Skill models describe the skills that the online and MCAS questions are tagged with. • Skill models used: • MCAS1 • MCAS5 • MCAS39 • WPI106

  13. Skill Models • Skill models used for the MCAS test, which consists of 29 multiple-choice questions: • MCAS1 • MCAS5

  14. Skill Models • MCAS39 • WPI106 • MCAS1 is a two-layer network with skill nodes mapped directly to question nodes. The other 3 networks have a third, intermediary layer of ‘AND’ nodes, which allows all question nodes to have the same number of parameters (slip/guess). The ‘AND’ nodes also capture the notion that a student must know all tagged skills to answer the item correctly.
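
A hedged sketch of this three-layer structure in BNT with two skill parents; the layout, names, and parameter values are illustrative and not the actual MCAS39/WPI106 models.

```matlab
% Illustrative three-layer structure: skills -> deterministic 'AND' -> question.
N = 4; s1 = 1; s2 = 2; and_node = 3; q = 4;
dag = zeros(N, N);
dag([s1 s2], and_node) = 1;                       % both skills feed the AND node
dag(and_node, q) = 1;                             % AND node is the question's parent
bnet = mk_bnet(dag, 2 * ones(1, N), 'discrete', 1:N);

bnet.CPD{s1} = tabular_CPD(bnet, s1, 'CPT', [0.5 0.5]);
bnet.CPD{s2} = tabular_CPD(bnet, s2, 'CPT', [0.5 0.5]);

% Deterministic AND: state 2 ('known') only when both parent skills are known.
and_cpt = zeros(2, 2, 2);                         % dims: s1, s2, and_node
and_cpt(:, :, 1) = [1 1; 1 0];                    % P(and = unknown | s1, s2)
and_cpt(:, :, 2) = [0 0; 0 1];                    % P(and = known   | s1, s2)
bnet.CPD{and_node} = tabular_CPD(bnet, and_node, 'CPT', and_cpt);

% Every question node then needs only the same two parameters: slip and guess.
guess = 0.25; slip = 0.10;
bnet.CPD{q} = tabular_CPD(bnet, q, 'CPT', [1-guess slip guess 1-slip]);
```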

  15. Skill Models • Transfer table mapping skills between the models (example skills: Equation-Solving, Inequality-Solving, X-Y-Graph, Congruence)

  16. Parameters • Parameters were set as best-guess starting values. • The test model’s guess parameter is 0.25 because questions are multiple choice with four options. • Preliminary learning of parameters using EM on the MCAS1 network indicates a guess of 0.30, a slip of 0.38, and a prior of 0.44 on the skills. These numbers were calculated recently and are not used in our prediction results thus far.
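
A hedged sketch of what EM parameter learning looks like in BNT, reusing the two-node sketch shown after slide 9 (bnet, N, question); the responses vector here is hypothetical, and the 0.30/0.38/0.44 figures quoted on this slide come from the project's own run, not from this code.

```matlab
% EM sketch: skill nodes are hidden, question outcomes are observed.
responses = [2 1 2 2 1];            % hypothetical outcomes (1 = wrong, 2 = right)
num_cases = length(responses);
cases = cell(1, num_cases);
for l = 1:num_cases
    cases{l} = cell(1, N);          % all nodes hidden by default ([])
    cases{l}{question} = responses(l);
end
engine = jtree_inf_engine(bnet);
max_iter = 10;
[bnet_learned, ll_trace] = learn_params_em(engine, cases, max_iter);
s = struct(bnet_learned.CPD{question});   % BNT idiom for inspecting a learned CPT
disp(s.CPT);                              % learned guess/slip live in this table
```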

  17. Implementation • The main routine ‘bn_eval()’ takes in: • The name of the skill model • The student ID • The BNT object of the skill model’s Bayes net • ‘bn_eval()’ outputs: • Status messages • Predicted score / actual score / accuracy • It also logs prediction and skill-assessment data

  18. Implementation • The evaluation is a 2-stage process • Stage 1 • The Bayes skill model for the online data is loaded • The student’s online results are compiled and sequenced for the network • The student is given credit for all scaffold questions relating to a top-level item answered correctly • Results are entered into the network as evidence • Marginals on the skill nodes are calculated using likelihood_weighting approximate inference
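
A sketch of how this stage might look in BNT, assuming its likelihood_weighting_inf_engine; bnet_online, num_nodes, responses (rows of [question node index, observed outcome]), and skill_nodes are hypothetical placeholders for the real model and logged data.

```matlab
% Stage 1 sketch: enter online responses as evidence, read off skill marginals.
engine = likelihood_weighting_inf_engine(bnet_online);
evidence = cell(1, num_nodes);
for k = 1:size(responses, 1)
    evidence{responses(k, 1)} = responses(k, 2);   % [question node, outcome]
end
engine = enter_evidence(engine, evidence);
num_skills = length(skill_nodes);
skill_marginals = zeros(1, num_skills);
for s = 1:num_skills
    m = marginal_nodes(engine, skill_nodes(s));
    skill_marginals(s) = m.T(2);                   % P(skill known | online work)
end
```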

  19. Implementation • Stage 2 of evaluation • The Bayes skill model for the MCAS test is loaded • The skill marginals calculated in Stage 1 are entered into the test model as soft evidence • Marginals on the question nodes are calculated using jtree (join-tree) exact inference • The predicted score is computed by summing the question-node marginals (each question is worth 1 point, so each marginal contributes its probability of a correct answer) and taking the ceiling of the total • The predicted test score is compared to the student’s actual test score
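
A sketch of Stage 2 under the assumption that BNT's optional 'soft' evidence argument to enter_evidence is used for the skill marginals; bnet_test, num_test_nodes, skill_nodes, and question_nodes are hypothetical placeholders for the test model's node layout.

```matlab
% Stage 2 sketch: push Stage-1 skill marginals into the test model, then
% turn question-node marginals into a predicted score.
engine = jtree_inf_engine(bnet_test);
hard_ev = cell(1, num_test_nodes);                 % no hard evidence on the test net
soft_ev = cell(1, num_test_nodes);
for s = 1:length(skill_nodes)
    p = skill_marginals(s);                        % from Stage 1
    soft_ev{skill_nodes(s)} = [1-p, p];            % virtual evidence on the skill
end
engine = enter_evidence(engine, hard_ev, 'soft', soft_ev);

expected_points = 0;
for k = 1:length(question_nodes)
    m = marginal_nodes(engine, question_nodes(k));
    expected_points = expected_points + m.T(2);    % 1 point * P(correct)
end
predicted_score = ceil(expected_points);           % compared against the actual MCAS score
```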

  20. Implementation • Example student run using MCAS1 model

  21. Implementation • Assessed skill marginals using MCAS1

  22. Implementation • Example student run using MCAS5 model

  23. Implementation • Assessed skill marginals using MCAS5

  24. Implementation • Example student run using MCAS39 model

  25. Implementation • Assessed skill marginals using MCAS39

  26. Implementation • Example student run using WPI106 model

  27. Implementation • Assessed skill marginals using WPI106

  28. Results • Model performance/accuracy results: • MAD is the mean absolute difference between predicted and actual scores. The test is out of 29 points, so a MAD of 4.5 indicates that the model’s predicted score is, on average, 4.5 points away from the actual score.
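
The MAD described here is just the mean of the absolute prediction errors; a short MATLAB illustration with placeholder scores (the real evaluation uses the 600 students' predicted and actual MCAS scores):

```matlab
% MAD over placeholder scores, not the project's data.
predicted = [22 18 25 14];                 % predicted scores, out of 29
actual    = [20 15 27 16];                 % actual MCAS scores
mad = mean(abs(predicted - actual));
fprintf('MAD = %.2f points\n', mad);
```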

  29. Future Work • Reduce runtime • Optimize the number of samples used with likelihood_weighting inference for each model • Increase accuracy • Learn full parameters in all models • Use the analysis to improve skill model tagging • Experiment with alternative models • Combine skill models into a hierarchy • Introduce time as a variable (dynamic Bayesian networks, DBNs)

  30. References • A copy of this presentation, as well as our initial paper submitted to ITS2006, entitled “Using Fine-Grained Skill Models to Fit Student Performance with Bayesian Networks”, can be found online at: http://users.wpi.edu/~zpardos/bayes.html • Thanks to the WPI CS department, Neil Heffernan, contributors at CMU, and the ASSISTment developers.
