
In-vivo research on learning

An overview of in-vivo experiments on learning: their defining features, example studies, and the challenges researchers face. Covers how knowledge component analyses and instructional hypotheses are mapped to classroom interventions, and how learning, retention, and transfer are assessed.


Presentation Transcript


  1. In-vivo research on learning Charles Perfetti PSLC Summer School 2009

  2. In-vivo experiments • In Vitro • In Vivo

  3. Features of in-vivo experiments in learning • “On-Line” course? • An Intelligent Tutoring System? • A real class; real students; an intervention that counts.

  4. The value of in-vivo experiments in learning • Noisy, uncontrolled environment • Content of intervention is validated by course goals • So: built-in generalization to classroom learning

  5. Problems faced by an in-vivo researcher • Noisy, uncontrolled environment • As for your experiment: • Students have other things to do • Instructors have other things to do

  6. Examples of in-vivo studies • Algebra, Physics, Chemistry, Geometry, French, Chinese, English • Some with computer tutors in major role • ITS • Practice tutors • Some without tutors or tutors in minor role

  7. Pre-requisites for an in-vivo experiment • Knowledge components analysis • Mapping of KCA to a learning or instructional hypothesis • Theory based • Empirical precedent • Mapping instructional hypothesis to specific intervention

  8. Knowledge Components vs. curriculum topic • Single topic (area) as unit vs. 12 separate KCs as units • Enabled by DataShop
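
As a rough illustration of slide 8's contrast (not DataShop's actual data format; all step and KC names below are invented), this sketch codes the same practice transactions first with one curriculum topic as the unit of analysis and then with separate knowledge components, which is what makes per-KC learning curves possible:

```python
# The same practice transactions coded two ways (step and KC names invented;
# this is not DataShop's actual data format).
transactions = [
    {"student": "s01", "step": "identify-radical", "correct": False},
    {"student": "s01", "step": "map-radical-meaning", "correct": True},
    {"student": "s02", "step": "map-character-meaning", "correct": True},
]

# Coarse model: every step belongs to one curriculum topic.
topic_model = {t["step"]: "chinese-characters" for t in transactions}

# Finer model: each step maps to its own knowledge component.
kc_model = {
    "identify-radical": "KC-radical-form",
    "map-radical-meaning": "KC-radical-meaning",
    "map-character-meaning": "KC-character-meaning",
}

def error_rate_by_unit(transactions, model):
    """Aggregate error rate per analysis unit (topic or KC)."""
    totals, errors = {}, {}
    for t in transactions:
        unit = model[t["step"]]
        totals[unit] = totals.get(unit, 0) + 1
        errors[unit] = errors.get(unit, 0) + (not t["correct"])
    return {unit: errors[unit] / totals[unit] for unit in totals}

print(error_rate_by_unit(transactions, topic_model))  # one aggregate number
print(error_rate_by_unit(transactions, kc_model))     # one number per KC
```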

  9. Mapping a KCA onto an instructional hypothesis • The case of Chinese characters • Whole character (zao3) = early morning; radical = sun

  10. Mapping an instructional hypothesis to an instructional intervention • Learning event space

  11. Instructional Event Space (diagram) • Learning Events, Instructional Events, Assessment Events, Performance • Instructional events (explicit or implicit): focus on valid features, make knowledge accessible, promote active processing, schedule events effectively, coordinate multiple events

  12. Learning meanings of Chinese characters: Knowledge Components Analysis • 2 (+2) knowledge components: the character as a whole (plus its meaning); the radical that is part of the character (plus its meaning) • Example: whole character (zao3) = early morning; radical = sun • Two approaches based on this analysis: (1) Dunlap, Liu, & Perfetti; (2) Pavlik • Two different Instructional Events manipulations; (1) is illustrated here: feature focus

  13. Instructional Event Space: default (typical) instructional event • Associate character form with meaning: whole character means x (early morning) • Assessment Events → Performance

  14. Instructional Event Space: Dunlap et al. instructional event manipulation (semantic radical instruction) • Associate the radical with x' and the whole character with x: part of the character means x' (highlighted radical = sun/day); whole character = early morning • Assessment Events → Performance
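
A minimal sketch of the 2 (+2) knowledge-component analysis from slides 12-14, assuming a simple dictionary representation (field names are illustrative, not taken from the study materials); the flag distinguishes the default instructional event from the Dunlap et al. semantic-radical manipulation:

```python
# One character item under the 2 (+2) KC analysis; field names are illustrative.
character_item = {
    "pinyin": "zao3",
    "whole_character": {"kc": "character-meaning", "meaning": "early morning"},
    "radical": {"kc": "radical-meaning", "meaning": "sun"},
}

def instructional_event(item, highlight_radical=False):
    """Return the (KC, meaning) pairs an instructional event presents."""
    parts = ["whole_character"]
    if highlight_radical:          # semantic-radical instruction (slide 14)
        parts.append("radical")
    return [(item[p]["kc"], item[p]["meaning"]) for p in parts]

print(instructional_event(character_item))                          # default event (slide 13)
print(instructional_event(character_item, highlight_radical=True))  # radical manipulation
```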

  15. Learning English Spelling (Background knowledge and feature focusing themes) Dunlap, Juffs, Friedline, Perfetti

  16. KC analysis of English spelling • phonology—orthography • /breit/--brate • /hiyl/--heel • /hiyl/--heal • So: phonology-semantics-orthography
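
To make the ambiguity on slide 16 concrete, here is a small sketch (the meaning glosses are mine, not from the slide) showing why a pure phonology-to-orthography mapping is one-to-many for /hiyl/, while routing through meaning (phonology-semantics-orthography) restores a one-to-one mapping:

```python
# Pure phonology -> orthography pairs are one-to-many for /hiyl/.
phon_to_spelling = {
    "/breit/": ["brate"],
    "/hiyl/": ["heel", "heal"],
}

# Routing through meaning (phonology -> semantics -> orthography) makes the
# mapping one-to-one. The glosses are illustrative, not from the slides.
phon_meaning_to_spelling = {
    ("/hiyl/", "back of the foot"): "heel",
    ("/hiyl/", "to cure"): "heal",
}

print(phon_to_spelling["/hiyl/"])                       # ['heel', 'heal'] -- ambiguous
print(phon_meaning_to_spelling[("/hiyl/", "to cure")])  # 'heal' -- disambiguated
```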

  17. Feature focusing interventions • 130 students in levels 3, 4, & 5 • Interventions: • “Pure” feature focus: form only (pronunciation-spelling pairs) • Meaning-mediated focus: form + meaning (pronunciation-meaning-spelling triads) • 7 sessions, 30 minutes per session, over 7 weeks

  18. Dunlap, Juffs, Friedline, Perfetti

  19. Control conditions for in-vivo experiments • Typical control conditions: existing classroom instruction; textbook & exercise problems • For cog tutors: another tutoring system; human tutoring; a control intervention • 2 plausible interventions: which is more effective?

  20. Learning Assessments • Immediate Learning • Long-term retention • Transfer • Over content, form, testing situations • Accelerated Future Learning • New content; learning measure

  21. Instructional Event Space (diagram, with assessments) • Learning Events, Instructional Events, Assessment Events, Performance • Instructional events (explicit or implicit): focus on valid features, make knowledge accessible, promote active processing, schedule events effectively, coordinate multiple events • Assessments: learning, long-term retention, transfer, accelerated future learning

  22. Transfer illustrated: Liu, Wang, Perfetti Chinese tone perception study • In-vivo study • Traditional classroom (not online) • Materials from students’ textbook • New materials each week for 8 weeks of term 1 • Term 2 continued this and added novel syllables unfamiliar to the students • 3 instructional conditions: tone number + pinyin; contour + pinyin; contour only • Hint system • Tutors (CTAT) presented materials in 3 different instructional interfaces, according to the 3 conditions • DataShop logged individual student data

  23. Illustration of 2 conditions from Liu et al. (example syllable: shi)

  24. Data from the Liu et al. tone study: learning curves week-by-week

  25. Multiple kinds of transfer • Liu et al. show 2 kinds of materials transfer • Within term 1 learning sessions, each syllable to be learned was different but familiar: transfer of learning to familiar items • In the second term, there were unfamiliar syllables: transfer of learning to unfamiliar items (not so good)

  26. Accelerated future learning: example of acceleration of future learning (Min Chi & VanLehn) • First probability, then physics; during probability only: • Half the students taught an explicit strategy • Half not taught a strategy (normal instruction) • Figures: pre/post scores for probability training and for physics training; ordinary transfer vs. acceleration of future learning

  27. Creating assessments • General strategy: • Guided by cognitive task analysis (pre-test as well) including learning goals and specific knowledge components • Include some items from the pre-test • Check for basic learning • Some items similar to training items • Measures near-transfer • Some problems dissimilar to training problems • Measures far-transfer

  28. Mistakes to avoid in test design • Tests that are too difficult, too easy, or too long • Tests that fail to represent instructed content (missing content; over-sampling from some content) or depend too much on background knowledge • Notice problems in test means; notice variances
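
One quick way to notice problems in test means and variances is a simple screen over item-level scores; the sketch below uses invented data and is not tied to any particular study:

```python
import numpy as np

# Invented item-level scores (rows = students, columns = test items),
# screened for floor/ceiling effects and low variance.
scores = np.array([
    [1, 1, 0, 1, 1, 0, 1],
    [1, 1, 0, 1, 1, 1, 1],
    [1, 0, 0, 1, 1, 0, 1],
    [1, 1, 1, 1, 1, 0, 1],
])

item_means = scores.mean(axis=0)   # near 1.0 = too easy, near 0.0 = too hard
totals = scores.sum(axis=1)        # per-student total score

print("item means:", item_means)
print("total-score variance:", totals.var())  # near zero -> test not discriminating
```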

  29. Interpreting test results as learning • Post-test in relation to pre-test; 2 strategies: • ANOVA on gain scores (first check pre-test equivalence; not recommended if pre-tests are not equivalent), or with pre-test/post-test as a within-subjects variable (t-tests for non-independent samples) • ANCOVA: post-test scores are the dependent variable; pre-test scores are the covariate
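
The analysis strategies on slide 29 might look roughly like this in Python (scipy/statsmodels); the scores are simulated purely for illustration, and the model formula is one reasonable way to set up the ANCOVA, not the authors' actual analysis:

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Simulated pre/post scores for two conditions -- for illustration only.
rng = np.random.default_rng(0)
n = 40
df = pd.DataFrame({
    "condition": ["treatment"] * n + ["control"] * n,
    "pre": rng.normal(50, 10, 2 * n),
})
df["post"] = (df["pre"]
              + np.where(df["condition"] == "treatment", 8, 3)
              + rng.normal(0, 5, 2 * n))

# Strategy 1a: check pre-test equivalence, then compare gain scores.
print(stats.ttest_ind(df.loc[df.condition == "treatment", "pre"],
                      df.loc[df.condition == "control", "pre"]))
gain = df["post"] - df["pre"]
print(stats.ttest_ind(gain[df.condition == "treatment"],
                      gain[df.condition == "control"]))

# Strategy 1b: pre vs. post as a within-subjects variable
# (t-test for non-independent samples, here pooled over conditions).
print(stats.ttest_rel(df["post"], df["pre"]))

# Strategy 2: ANCOVA -- post-test as dependent variable, pre-test as covariate.
ancova = smf.ols("post ~ pre + C(condition)", data=df).fit()
print(ancova.summary().tables[1])
```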

  30. Plot learning results • Bar graphs for instructional conditions • Differences due to conditions • Learning Curves • Growth over time/instruction

  31. Bar graphs (with error bars!)
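
A bar graph of condition means with error bars could be produced along these lines (matplotlib; the condition labels echo slide 17, but the means and standard errors are made up):

```python
import matplotlib.pyplot as plt

# Invented condition means and standard errors, only to show the plot style.
conditions = ["form only", "form + meaning", "control"]
means = [0.72, 0.81, 0.64]   # e.g., post-test proportion correct
sems = [0.03, 0.03, 0.04]    # standard errors of the mean

fig, ax = plt.subplots()
ax.bar(conditions, means, yerr=sems, capsize=4)
ax.set_ylabel("Post-test proportion correct")
ax.set_title("Instructional conditions (illustrative data)")
plt.show()
```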

  32. Learning curves (figure): error rate by weekly sessions over 2 terms

  33. Learning curves (figure, continued): error rate by weekly sessions over 2 terms
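
A learning-curve plot of error rate by weekly session, one line per instructional condition, might be sketched like this (condition labels follow slide 22; the curves are simulated and their ordering is arbitrary, and the 8-week term boundary is an assumption):

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulated error rates per weekly session over two terms (8 weeks each is
# an assumption based on slide 22); curve shapes and ordering are arbitrary.
weeks = np.arange(1, 17)
rng = np.random.default_rng(1)
for label, rate in [("contour + pinyin", 0.18),
                    ("tone number + pinyin", 0.14),
                    ("contour only", 0.10)]:
    errors = 0.6 * np.exp(-rate * weeks) + rng.normal(0, 0.02, weeks.size)
    plt.plot(weeks, errors, marker="o", label=label)

plt.axvline(8.5, linestyle="--", color="gray")  # assumed term boundary
plt.xlabel("Weekly sessions over 2 terms")
plt.ylabel("Error rate")
plt.legend()
plt.show()
```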

  34. A final word on experiments • In-vivo limitations • The role of (in-vitro) laboratory studies

  35. The end
