
Experimenter-defined measures in a Reading Tutor that Listens


Presentation Transcript


  1. Experimenter-defined measures in a Reading Tutor that Listens • Jack Mostow • Project LISTEN (www.cs.cmu.edu/~listen) • Carnegie Mellon University • Funding: National Science Foundation • IERI PI Meeting, Sept. 9-10, 2004

  2. Thanks to fellow LISTENers • Tutoring: Dr. Joseph Beck, mining tutorial data; Prof. Albert Corbett, cognitive tutors; Prof. Rollanda O’Connor, reading; Prof. Kathy Ayres, stories for children; Joe Valeri, activities and interventions; Becky Kennedy, linguist • Listening: Dr. Mosur Ravishankar, recognizer; Dr. Evandro Gouvea, acoustic training; John Helman, transcriber • Programmers: Andrew Cuneo, application; Karen Wong, Teacher Tool • Field staff: Dr. Roy Taylor; Kristin Bagwell; Julie Sleasman • Grad students: Hao Cen, HCI; Cecily Heiner, MCALL; Peter Kant, Education; Shanna Tellerman, ETC • Plus: Advisory board; Research partners (DePaul, UBC, U. Toronto); Schools

  3. Uses of experimenter-defined measures • Analyze usage • Assess students • Evaluate interventions

  4. 1. Analyzing usage • Participation • What % of enrolled students use the Reading Tutor? • Frequency • How often do they use it? • Duration • For how long at a time? • Hiatus rate • With how many timeouts? • Crash rate • With what % ending in crashes?
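
As a rough illustration of how usage measures like these could be computed, here is a minimal Python sketch over per-session log records. The Session fields (student_id, start, end, hiatus_count, crashed) are hypothetical stand-ins for whatever the Reading Tutor database actually logs, not its real schema.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

@dataclass
class Session:
    student_id: str       # hypothetical field names; the real schema may differ
    start: datetime
    end: datetime
    hiatus_count: int     # timeouts during the session
    crashed: bool         # did the session end in a crash?

def usage_measures(sessions: list[Session], enrolled: set[str]) -> dict:
    users = {s.student_id for s in sessions}
    days = len({s.start.date() for s in sessions}) or 1
    return {
        "participation": len(users & enrolled) / len(enrolled),   # % of enrolled students who used it
        "sessions_per_day": len(sessions) / days,                 # how often it is used
        "minutes_per_session": mean(
            (s.end - s.start).total_seconds() / 60 for s in sessions),
        "hiatuses_per_session": mean(s.hiatus_count for s in sessions),
        "crash_rate": mean(s.crashed for s in sessions),          # fraction ending in a crash
    }
```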

  5. 2002-2003 usage: lab > classroom, but top classrooms > average lab

  6. 2. Assessing students: Latency decreases over time • Initial encounter of “muttered”: • I’ll have to mop up all this (5630 ms) muttered Dennis to himself but how • 5 weeks later: • Dennis (110 ms) muttered oh I forgot to ask him for the money • How does latency evolve in general?

  7. Learning curves for help and latency (for 2.4 million word encounters in ’00-01) • [Plots: word latency (in ms) and help request rate vs. # of previous encounters of the word] • Help and latency predict WI scores with correlation > 0.9.
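
A sketch of how such learning curves could be built from word-encounter records: bin each encounter by how many times the student has previously seen that word, then average latency and help-request rate within each bin. The field names are assumptions for illustration, not the actual log format.

```python
from collections import defaultdict
from statistics import mean

def learning_curves(encounters):
    """encounters: chronological word-encounter records, each a dict with
    (assumed) keys 'student', 'word', 'latency_ms', 'asked_for_help' (bool)."""
    prior = defaultdict(int)     # (student, word) -> number of previous encounters
    bins = defaultdict(list)     # previous-encounter count -> [(latency_ms, asked_for_help)]
    for e in encounters:
        key = (e["student"], e["word"])
        bins[prior[key]].append((e["latency_ms"], e["asked_for_help"]))
        prior[key] += 1
    # one curve point per encounter index: mean latency and help-request rate
    return {n: {"latency_ms": mean(l for l, _ in obs),
                "help_rate": mean(h for _, h in obs)}
            for n, obs in sorted(bins.items())}
```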

  8. Time as an indicator of motivation • % time picking stories predicts lower gains (R = –0.4) • Mostow, J., Aist, G., Beck, J., Chalasani, R., Cuneo, A., Jia, P., & Kadaru, K. (2002, June 5-7). A La Recherche du Temps Perdu, or As Time Goes By: Where does the time go in a Reading Tutor that listens? Proceedings of the Sixth International Conference on Intelligent Tutoring Systems (ITS'2002), Biarritz, France, 320-329. • Hasty responses indicate guessing • Beck, J. E. (2004, August 31). Using response times to model student disengagement. Proceedings of the ITS2004 Workshop on Social and Emotional Intelligence in Learning Environments, Maceió, Brazil.
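
A hedged sketch of the kind of analysis behind the first bullet: compute each student's share of Reading Tutor time spent picking stories and correlate it with pre-to-post gain. The activity label and log shape are illustrative, not the analysis published in the cited papers.

```python
from statistics import correlation   # Python 3.10+

def pct_time_picking_stories(time_by_activity):
    """time_by_activity: {student: {activity: seconds}} -- an assumed log summary."""
    pct = {}
    for student, activities in time_by_activity.items():
        total = sum(activities.values())
        pct[student] = activities.get("picking_stories", 0) / total if total else 0.0
    return pct

def correlate_with_gains(pct, gains):
    """gains: {student: post-test minus pre-test score}. Returns Pearson r."""
    students = sorted(pct.keys() & gains.keys())
    return correlation([pct[s] for s in students], [gains[s] for s in students])
```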

  9. 3. Evaluating interventions: vocabulary [Aist PhD thesis] • Randomly explain some new words but not others. • Test each new word the next day. • Did kids do better on explained vs. unexplained words? • Overall: NO; 38% ≈ 36%, N = 3,171 trials • Rare, 1-sense words tested 1-2 days later: YES! 44% >> 26%, N = 189.
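
The embedded vocabulary experiment boils down to comparing proportions correct between randomly explained and unexplained words, overall and on subsets such as the rare single-sense words. A minimal sketch, with illustrative field names rather than the trial table from Aist's thesis:

```python
def compare_conditions(trials, subset=lambda t: True):
    """trials: one record per tested word, with (assumed) boolean keys
    'explained' and 'correct_next_day'; 'subset' restricts the analysis."""
    rows = [t for t in trials if subset(t)]
    def success_rate(explained):
        group = [t["correct_next_day"] for t in rows if t["explained"] == explained]
        return sum(group) / len(group), len(group)   # (proportion correct, N)
    return {"explained": success_rate(True), "unexplained": success_rate(False)}

# Overall, then the rare single-sense words tested 1-2 days later
# (field names 'rare', 'senses', 'delay_days' are illustrative):
# compare_conditions(trials)
# compare_conditions(trials, lambda t: t["rare"] and t["senses"] == 1 and 1 <= t["delay_days"] <= 2)
```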

  10. Evaluating interventions: word help [SSSR04] • Student is reading a story: ‘People sit down and …’ • Student needs help on a word: student clicks ‘read.’ • Tutor chooses what help to give: decision (randomized) • Student continues reading: ‘… read a book.’ • Time passes… • Student sees word in a later sentence: ‘I love to read stories.’ Outcome: read fluently? • How does outcome vary by help, word, and delay?
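
A sketch of how the event stream above might be mapped to trials: each randomized help event opens a trial whose outcome is whether the student reads the same word fluently at its next encounter, with the elapsed time recorded as the delay. The event record shapes are assumptions for illustration.

```python
def build_word_help_trials(events):
    """events: one student's chronological event stream, with (assumed) records
    {'type': 'help', 'word': w, 'help_kind': k, 'time': t} when the tutor gives help and
    {'type': 'encounter', 'word': w, 'fluent': bool, 'time': t} when the word recurs."""
    trials, pending = [], {}
    for e in events:
        if e["type"] == "help":
            # the help kind is the tutor's randomized decision
            pending[e["word"]] = {"word": e["word"],
                                  "decision": e["help_kind"],
                                  "help_time": e["time"]}
        elif e["type"] == "encounter" and e["word"] in pending:
            trial = pending.pop(e["word"])
            trial["delay"] = e["time"] - trial["help_time"]   # time passed before the next encounter
            trial["outcome"] = e["fluent"]                    # read fluently?
            trials.append(trial)
    return trials
```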

  11. What helped which words best, when? (for 270 students, 180,909 randomized trials) • Best overall: Rhymes With, 69.2% ± 0.4% • Worst overall: Recue, 55.6% ± 0.4% • Compare within level to control for word difficulty. • Supplying the word helped best in the short term… • But rhyming hints had longer-lasting benefits.
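
Comparing help types within level, as the slide suggests, is a stratified comparison; here is a sketch under the assumption that each trial record carries a 'level' field for word difficulty along with the 'decision' and 'outcome' fields from the previous sketch.

```python
from collections import defaultdict

def success_by_level_and_help(trials):
    """Stratify trials by word-difficulty 'level' so that help types are
    compared only on words of similar difficulty."""
    cells = defaultdict(lambda: [0, 0])     # (level, help_kind) -> [fluent count, total]
    for t in trials:
        cell = cells[(t["level"], t["decision"])]
        cell[0] += t["outcome"]
        cell[1] += 1
    return {key: fluent / total for key, (fluent, total) in cells.items()}
```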

  12. Project LISTEN’s Reading Tutor: a rich source of experimental data • 2003-2004 database: 9 schools, > 200 computers, > 50,000 sessions, > 1.5M tutor responses, > 10M words recognized • Embedded experiments • Randomized trials • See videos, papers, etc. at www.cs.cmu.edu/~listen. • Thank you! Questions?

  13. Project LISTEN’s Reading Tutor (video) • John Rubin (2002). The Sounds of Speech (Show 3). On Reading Rockets (Public Television series commissioned by U.S. Department of Education). Washington, DC: WETA. • Available at www.cs.cmu.edu/~listen.

  14. Map data stream to data set: trials • Context • Decision • Outcome
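
One way to read this slide: every embedded experiment maps the tutor's raw data stream onto trial records, each with a context, a randomized decision, and an outcome. A minimal record type, with illustrative fields:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Trial:
    context: dict[str, Any]   # e.g. student, word, story, # of previous encounters
    decision: str             # the randomized choice the tutor made (e.g. which kind of help)
    outcome: Any              # e.g. read fluently at next encounter? correct on next-day test?
```

Slides 9-11 are instances of this mapping: the decision is which explanation or help type the tutor chose, and the outcome is the next-day test result or fluent reading at the word's next encounter.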
