
Contextual Vocabulary Acquisition: From Algorithm to Curriculum

William J. Rapaport, Department of Computer Science & Engineering, Department of Philosophy, Center for Cognitive Science; Michael W. Kibby, Department of Learning & Instruction, Center for Literacy & Reading Instruction


Presentation Transcript


  1. Contextual Vocabulary Acquisition: From Algorithm to Curriculum
  William J. Rapaport, Department of Computer Science & Engineering, Department of Philosophy, Center for Cognitive Science
  Michael W. Kibby, Department of Learning & Instruction, Center for Literacy & Reading Instruction
  State University of New York at Buffalo
  NSF ROLE Grant REC-0106338

  2. Definition of “CVA”
  “Contextual Vocabulary Acquisition” =def
  • the acquisition of word meanings from text
  • “incidental” or “deliberate”
  • by reasoning about:
  • contextual cues
  • background knowledge, including hypotheses from prior encounters (if any) with the word
  • without external sources of help:
  • no dictionaries
  • no people

  3. Project Goals • Develop & implement computational theory of CVA based on human protocols (case studies) • Translate algorithms into a curriculum • To improve CVA and reading comprehension in science, technology, engineering, math (“STEM”) • Use new case studies, based on the curriculum, to improve both algorithms & curriculum

  4. People Do “Incidental” CVA
  • Know more words than explicitly taught
  • Average high-school senior knows ~40K words ⇒ learned ~3K words/school-year (over 12 yrs.)
  • But only taught a few hundred/school-year
  • ⇒ Most word meanings learned from context, “incidentally” (unconsciously)
  • How?

  5. People Also Do “Deliberate” CVA • You’re reading; • You understand everything you read, until… • You come across a new word • Not in dictionary • No one to ask • So, you try to “figure out”/“learn”/“acquire” its meaning from: • its context • + your background knowledge • How? • guess? derive? infer? deduce? educe? construct? predict? …

  6. What does ‘brachet’ mean?

  7. (From Malory’s Morte D’Arthur [page # in brackets])
  1. There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them. [66]
  • As the hart went by the sideboard, the white brachet bit him. [66]
  • The knight arose, took up the brachet and rode away with the brachet. [66]
  • A lady came in and cried aloud to King Arthur, “Sire, the brachet is mine”. [66]
  • There was the white brachet which bayed at him fast. [72]
  18. The hart lay dead; a brachet was biting on his throat, and other hounds came behind. [86]

  8. Computational cognitive theory of how to learn word meanings • From context • I.e., text + grammatical info + reader’s prior knowledge • With no external sources (human, on-line) • Unavailable, incomplete, or misleading • Domain-independent • But more prior domain-knowledge yields better definitions • “definition” = hypothesis about word’s meaning • Revisable each time word is seen

  9. Cassie learns what “brachet” means:
  Background info about: harts, animals, King Arthur, etc.
  No info about: brachets
  Input: formal-language version of simplified English
  A hart runs into King Arthur’s hall.
  • In the story, B17 is a hart.
  • In the story, B18 is a hall.
  • In the story, B18 is King Arthur’s.
  • In the story, B17 runs into B18.
  A white brachet is next to the hart.
  • In the story, B19 is a brachet.
  • In the story, B19 has the property “white”.
  • Therefore, brachets are physical objects. (deduced while reading; Cassie believes that only physical objects have color)

  10. --> (defn_noun 'brachet)
  (CLASS INCLUSION = (PHYS OBJ)
   structure = nil
   function = nil
   actions = (nil)
   ownership = nil
   POSSIBLE PROPERTIES = ((WHITE))
   synonyms = nil)
  I.e., a brachet is a physical object that may be white.

  11. A hart runs into King Arthur’s hall.
  A white brachet is next to the hart.
  The brachet bites the hart’s buttock.
  --> (defn_noun 'brachet)
  (CLASS INCLUSION = (ANIMAL)
   structure = nil
   function = nil
   ACTIONS = ((POSSIBLE ACTIONS = (BITE)))
   ownership = nil
   POSSIBLE PROPERTIES = ((WHITE))
   synonyms = nil)

  12. A hart runs into King Arthur’s hall.
  A white brachet is next to the hart.
  The brachet bites the hart’s buttock.
  The knight picks up the brachet.
  The knight carries the brachet.
  --> (defn_noun 'brachet)
  (CLASS INCLUSION = (ANIMAL)
   structure = nil
   function = nil
   ACTIONS = ((POSSIBLE ACTIONS = (BITE)))
   ownership = nil
   POSSIBLE PROPERTIES = ((SMALL WHITE))
   synonyms = nil)

  13. A hart runs into King Arthur’s hall.
  A white brachet is next to the hart.
  The brachet bites the hart’s buttock.
  The knight picks up the brachet.
  The knight carries the brachet.
  The lady says that she wants the brachet.
  --> (defn_noun 'brachet)
  (CLASS INCLUSION = (ANIMAL)
   structure = nil
   function = nil
   ACTIONS = ((POSSIBLE ACTIONS = (BITE)))
   ownership = nil
   POSSIBLE PROPERTIES = ((SMALL VALUABLE WHITE))
   synonyms = nil)

  14. A hart runs into King Arthur’s hall.
  A white brachet is next to the hart.
  The brachet bites the hart’s buttock.
  The knight picks up the brachet.
  The knight carries the brachet.
  The lady says that she wants the brachet.
  The brachet bays in the direction of Sir Tor. [background knowledge: only hunting dogs bay]
  --> (defn_noun 'brachet)
  ((A BRACHET IS A KIND OF (DOG))
   ACTIONS = (POSSIBLE ACTIONS = (BAY BITE))
   FUNCTION = (HUNT)
   structure = nil
   ownership = nil
   synonyms = nil)
  I.e., a brachet is a dog that may bay & bite, and that hunts.
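The incremental frame updating shown in slides 9–14 can be sketched as follows. This is a minimal illustration in the spirit of the demo, not the actual SNePS/Cassie implementation; the rule table, specificity chain, and function names are all hypothetical.

```python
# Sketch of incremental definition-frame updating ("brachet" demo).
# NOT the SNePS implementation; all names are hypothetical illustrations.

def new_frame():
    """Start with the weakest hypothesis: an unknown physical object."""
    return {"class": "phys obj", "properties": set(),
            "possible_actions": set(), "function": set()}

# Hypothetical background-knowledge rules: observed action -> implied class.
BACKGROUND = {
    "bite": "animal",   # only animals bite
    "bay":  "dog",      # only hunting dogs bay
}
SPECIFICITY = ["phys obj", "animal", "dog"]  # more specific rightward

def observe_action(frame, action):
    """Record an action; narrow the class only if background knowledge
    licenses a *more specific* hypothesis than the current one."""
    frame["possible_actions"].add(action)
    implied = BACKGROUND.get(action)
    if implied and SPECIFICITY.index(implied) > SPECIFICITY.index(frame["class"]):
        frame["class"] = implied
    if action == "bay":
        frame["function"].add("hunt")  # baying is done while hunting
    return frame

frame = new_frame()
frame["properties"].add("white")   # "a white brachet"
observe_action(frame, "bite")      # "the brachet bites" -> animal
observe_action(frame, "bay")       # "the brachet bays" -> dog, hunts
print(frame["class"])              # dog
```

Each new sentence can only refine the hypothesis; earlier slot-fillers ("white", "bite") are kept, mirroring how Cassie's definition converges across encounters.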

  15. General Comments
  • System’s behavior ≈ human protocols
  • System’s definition ≈ OED’s definition: a brachet is “a kind of hound which hunts by scent”

  16. Computational cognitive theory of how to learn word meanings from context (cont.) • 3 kinds of vocabulary acquisition: • Construct new definition of unknown word • What does ‘brachet’ mean? • Fully revise definition of misunderstood word • Does “smiting” entail killing? • Expand definition of word used in new sense • Can you “dress” (i.e., clothe) a spear? • Initial hypothesis; Revision(s) upon further encounter(s); Converges to stable, dictionary-like definition; Subject to revision

  17. Motivations & Applications • Part of cognitive-science projects • Narrative text understanding • Syntactic semantics (contra Searle’s Chinese-Room Argument) • Computational applications: • Information extraction • Autonomous intelligent agents: • There can be no complete lexicon • Agent/IE-system shouldn’t have to stop to ask questions • Other applications: • L1 & L2 acquisition research • Computational lexicography ** Education: improve reading comprehension **

  18. State of the Art • Vocabulary Learning: • Some dubious contributions: • Useless “algorithms” • Contexts that include definition • Useful contribution: • (good) reader’s word-model = updateable frame with slots & defaults • Psychology: • Cues to look for (= slots for frame): • Space, time, value, properties, functions, causes, classes, synonyms, antonyms • Can understand a word w/o having a definition • Computational Linguistics: • Systems need scripts, human informants, ontologies • Not needed in our system • CVA ≠ Word-Sense Disambiguation • Essay question vs. multiple-choice test

  19. State of the Art: Computational Linguistics • Granger 77: “Foul-Up” • Based on Schank’s theory of “scripts” • Our system not restricted to scripts • Zernik 87: self-extending phrasal lexicon • Uses human informant • Our system is really “self-extending” • Hastings 94: “Camille” • Maps unknown word to known concept in ontology • Our system can learn new concepts • Word-Sense Disambiguation: • Given ambiguous word & list of all meanings, determine the “correct” meaning • Multiple-choice test • Our system: given new word, compute its meaning • Essay question

  20. State of the Art: Vocabulary Learning (I) • Elshout-Mohr/van Daalen-Kapteijns 81,87: • Application of Winston’s AI “arch” learning theory • (Good) reader’s model of new word = frame • Attribute slots, default values • Revision by updating slots & values • Poor readers update by replacing entire frames • But EM & vDK used: • Made-up words • Carefully constructed contexts • Presented in a specific order

  21. Elshout-Mohr & van Daalen-Kapteijns Experiments with neologisms in 5 artificial contexts • When you are used to a view it is depressing when you live in a room with kolpers. • Superordinate information • At home he had to work by artificial light because of those kolpers. • During a heat wave, people want kolpers, so sun-blind sales increase. • Contexts showing 2 differences from the superordinate • I was afraid the room might have kolpers, but plenty of sunlight came into it. • This house has kolpers all summer until the leaves fall out. • Contexts showing 2 counterexamples due to the 2 differences

  22. State of the Art: Psychology • Sternberg et al. 83,87: • Cues to look for (= slots for frame): • Spatiotemporal cues • Value cues • Properties • Functions • Cause/enablement information • Class memberships • Synonyms/antonyms • Johnson-Laird 87: • Word understanding  definition • Definitions aren’t stored

  23. Sternberg • The couple there on the blind date was not enjoying the festivities in the least. An acapnotic, he disliked her smoking; and when he removed his hat, she, who preferred “ageless” men, eyed his increasing phalacrosis and grimaced. • To acquire new words from context: • Distinguish relevant/irrelevant information • Selectively combine relevant information • Compare this information with previous beliefs

  24. State of the Art: Vocabulary Learning (II) Some dubious contributions: • Mueser 84: “Practicing Vocabulary in Context” • BUT: “context” = definition !! • Clarke & Nation 80: a “strategy” (algorithm?) • Look at word & context; determine POS • Look at grammatical context • E.g., “what does what”? • Look at wider context • [E.g., search for Sternberg-like clues] • Guess the word; check your guess

  25. CVA: From Algorithm to Curriculum • “guess the word” = “then a miracle occurs” • Surely, we computer scientists can “be more explicit”!

  26. CVA: From algorithm to curriculum …
  • Treat “guess” as a procedure call
  • Fill in the details with our algorithm
  • Convert the algorithm into a curriculum
  • To enhance students’ abilities to use deliberate CVA strategies
  • To improve reading comprehension of STEM texts
  … and back again!
  • Use knowledge gained from CVA case studies to improve the algorithm
  • I.e., use Cassie to learn how to teach humans & use humans to learn how to teach Cassie

  27. Question (objection): Why not use a dictionary? Because: • People are lazy (!) • Dictionaries are not always available • Dictionaries are always incomplete • Dictionary definitions are not always useful • ‘chaste’ =df pure, clean / “new dishes are chaste” • Most words learned via incidental CVA, not via dictionaries

  28. Question (objection): Teaching computers ≠ teaching humans! But: • Our goal: • Not: teach people to “think like computers” • But: to explicate computable & teachable methods to hypothesize word meanings from context • AI as computational psychology: • Devise computer programs that are essentially faithful simulations of human cognitive behavior • Can tell us something about the human mind • We are teaching a machine to see if what we learn in teaching it can help us teach students better.

  29. Implementation • SNePS (Stuart C. Shapiro & SNeRG): • Intensional, propositional semantic-network knowledge-representation & reasoning system • Node-based & path-based reasoning • I.e., logical inference & generalized inheritance • SNeBR belief revision system • Used for revision of definitions • SNaLPS natural-language input/output • “Cassie”: computational cognitive agent

  30. How It Works • SNePS represents: • background knowledge + text information in a single, consolidated semantic network • Algorithms deductively search network for slot-fillers for definition frame • Search is guided by desired slots • E.g., prefers general info over particular info, but takes what it can get

  31. Noun Algorithm Find or infer: • Basic-level class memberships (e.g., “dog”, rather than “animal”) • else most-specific-level class memberships • else names of individuals • Properties of Ns (else, of individual Ns) • Structure of Ns (else …) • Functions of Ns (else …) • Acts that Ns perform (else …) • Agents that perform acts w.r.t. Ns & the acts they perform (else …) • Ownership • Synonyms Else do: “syntactic/algebraic manipulation” • “Al broke a vase” → a vase is something Al broke • Or: a vase is a breakable physical object
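The noun algorithm’s “else” cascade (basic-level class, else most-specific class, else names of individuals) can be sketched as a preference-ordered search. The network below is a toy dictionary, not a SNePS semantic network, and every query function is a hypothetical stand-in for the system’s deductive network searches.

```python
# Sketch of the noun algorithm's preference cascade. Illustration only;
# the real system deductively searches a SNePS network for slot-fillers.

def first_nonempty(queries, net, noun):
    """Return the fillers from the first query that finds any (the 'else' cascade)."""
    for query in queries:
        fillers = query(net, noun)
        if fillers:
            return fillers
    return []

# Hypothetical query functions over a toy network (a dict of labeled edges).
def basic_level(net, noun):    return net.get(("basic-class", noun), [])
def most_specific(net, noun):  return net.get(("specific-class", noun), [])
def individuals(net, noun):    return net.get(("individuals", noun), [])
def properties(net, noun):     return net.get(("property", noun), [])
def actions(net, noun):        return net.get(("action", noun), [])

def define_noun(net, noun):
    """Fill the definition frame, preferring general info but taking what it can get."""
    return {
        "class inclusion": first_nonempty(
            [basic_level, most_specific, individuals], net, noun),
        "possible properties": properties(net, noun),
        "possible actions": actions(net, noun),
    }

# Toy network: no basic-level class for "brachet" yet, only a specific one.
net = {
    ("specific-class", "brachet"): ["animal"],
    ("property", "brachet"): ["white"],
    ("action", "brachet"): ["bite"],
}
print(define_noun(net, "brachet"))
```

The cascade falls through to the specific class “animal” because no basic-level class is known yet, matching the slide’s “else most-specific-level class memberships” step.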

  32. Verb Algorithm • Find or infer: • Predicate structure: • Categorize arguments/cases • Results of V’ing: • Effects, state changes • Enabling conditions for V • Future work: • Classification of verb-type • Synonyms • [Also: preliminary work on adjective algorithm]

  33. Belief Revision • Used to revise definitions of words with different sense from current meaning hypothesis • SNeBR (ATMS; Martins & Shapiro 88): • If inference leads to a contradiction, then: • SNeBR asks user to remove culprit(s) • & automatically removes consequences inferred from culprit • SNePSwD(SNePS w/ Defaults; Martins & Cravo 91) • Currently used to automate step 1, above • AutoBR(Johnson & Shapiro, in progress) • Will replace SNePSwD

  34. Revision & Expansion
  • Removal & revision being automated via SNePSwD by ranking all propositions with kn_cat (most certain first):
  • intrinsic: info re: language; fundamental bkgd info (“before” is transitive)
  • story: info in text (“King Lot rode to town”)
  • life: bkgd info w/o vars or inference (dogs are animals)
  • story-comp: info inferred from text (King Lot is a king, rode on a horse)
  • life-rule.1: everyday commonsense bkgd info (BearsLiveYoung(x) → Mammal(x))
  • life-rule.2: specialized bkgd info (x smites y → x kills y by hitting y)
  • questionable (least certain): already-revised life-rule.2; not part of input
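The kn_cat ranking above determines which proposition to retract when a contradiction arises: prefer the least certain one. A minimal sketch of that selection, with hypothetical proposition records (not the SNePSwD mechanism):

```python
# Sketch of culprit selection by kn_cat certainty ranking. Illustration
# only; the ranks follow the slide's ordering, most certain first.

KN_CAT_RANK = {           # higher number = less certain = retract first
    "intrinsic": 0, "story": 1, "life": 2, "story-comp": 3,
    "life-rule.1": 4, "life-rule.2": 5, "questionable": 6,
}

def pick_culprit(inconsistent_props):
    """Of the propositions jointly causing a contradiction, return the
    one whose kn_cat is least certain."""
    return max(inconsistent_props, key=lambda p: KN_CAT_RANK[p["kn_cat"]])

# Hypothetical contradiction set from the "smite" example:
props = [
    {"stmt": "King Lot smote down King Arthur", "kn_cat": "story"},
    {"stmt": "smite(x,y,t) -> dead(y,t)",       "kn_cat": "life-rule.2"},
]
print(pick_culprit(props)["stmt"])
```

The story proposition (what the text actually says) outranks the specialized background rule, so the rule is the one revised.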

  35. Belief Revision: “smite”
  • Misunderstood word; 2-stage “subtractive” revision
  • Bkgd info includes:
  (*) smite(x,y,t) → hit(x,y,t) & dead(y,t) & cause(hit(x,y,t), dead(y,t))
  P1: King Lot smote down King Arthur
  D1: If person x smites person y at time t, then x hits y at t, and y is dead at t
  Q1: What properties does King Arthur have?
  R1: King Arthur is dead.
  P2: King Arthur drew Excalibur.
  Q2: When did King Arthur do this?
  • SNeBR is invoked:
  • KA’s drawing E is inconsistent with being dead
  • (*) replaced: smite(x,y,t) → hit(x,y,t) & [dead(y,t) → cause(hit, dead)]
  D2: If person x smites person y at time t, then x hits y at t & possibly (y is dead at t)
  P3: [another passage in which ¬(smiting → death)]
  D3: If person x smites person y at time t, then x hits y at t
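The two-stage “subtractive” revision can be sketched by modeling a rule as an antecedent plus a set of consequents and dropping the conjunct that the text contradicts. This is an illustration of the idea only, not SNeBR; the rule encoding is hypothetical.

```python
# Sketch of "subtractive" belief revision for the smite rule. When a
# consequence of the rule contradicts the text, that conjunct is removed
# from the rule's consequent. Illustration only; not SNeBR.

def revise_subtractively(rule, contradicted):
    """Return the rule with the contradicted consequent removed."""
    antecedent, consequents = rule
    return (antecedent, consequents - {contradicted})

smite = ("smite(x,y,t)", {"hit(x,y,t)", "dead(y,t)", "cause(hit,dead)"})

# Stage 1: King Arthur draws Excalibur after being smitten, so hitting
# cannot be what causes death; weaken the causal conjunct.
smite = revise_subtractively(smite, "cause(hit,dead)")

# Stage 2: another passage shows smiting with no death at all.
smite = revise_subtractively(smite, "dead(y,t)")

print(smite)  # only hit(x,y,t) survives as a consequence
```

After both stages the hypothesis converges on D3 from the slide: smiting entails only hitting.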

  36. Belief Revision: “dress”
  • “additive” revision
  • Bkgd info includes:
  (1) dresses(x,y) → ∃z[clothing(z) & wears(y,z)]
  (2) Spears don’t wear clothing
  (both kn_cat = life-rule.1)
  P1: King Arthur dressed himself.
  D1: A person can dress itself; result: it wears clothing.
  P2: King Claudius dressed his spear.
  [Cassie infers: King Claudius’s spear wears clothing.]
  Q2: What wears clothing?
  • SNeBR is invoked:
  • KC’s spear wears clothing inconsistent with (2).
  • (1) replaced: dresses(x,y) → ∃z[clothing(z) & wears(y,z)] ∨ NEWDEF
  • Replace (1), not (2), because of verb in antecedent of (1) (Gentner)
  P3: [other passages in which dressing spears precedes fighting]
  D2: A person can dress a spear or a person; result: person wears clothing or person is enabled to fight
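“Additive” revision differs from the subtractive case above: the old consequent is kept but widened into a disjunction with a new alternative result. A minimal sketch under the same hypothetical rule encoding (not the SNePSwD mechanism):

```python
# Sketch of "additive" belief revision for the dress rule: rather than
# retracting the old result, add a new alternative result (NEWDEF).
# Illustration only; the rule encoding is hypothetical.

def revise_additively(rule, new_disjunct):
    """Return the rule with an extra alternative in its consequent."""
    antecedent, disjuncts = rule
    return (antecedent, disjuncts | {new_disjunct})

# Old hypothesis: dressing y results in y wearing clothing.
dress = ("dresses(x,y)", {"wears(y, clothing)"})

# King Claudius dressed his spear, and spears don't wear clothing;
# later passages show dressing a spear precedes fighting, so add that
# as an alternative result instead of deleting the old one.
dress = revise_additively(dress, "enabled(y, fight)")

print(dress)
```

Keeping both disjuncts yields D2 from the slide: dressing results in wearing clothing or being enabled to fight, depending on what is dressed.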

  37. Using SNePS Networks • See handout

  38. Research Methodology • AI team: • Develop, implement, & test better computational theories of CVA • Translate into English for use by reading team • Reading team: • Convert algorithms to curriculum • Think-aloud protocols • To gather new data for use by AI team • As curricular technique (case studies)

  39. Problem in Converting Algorithm into Curriculum • “A knight picks up a brachet and carries it away …” • Cassie: • Has “perfect memory” • Is “perfect reasoner” • Automatically infers that brachet is small • People don’t always realize this: • May need prompting: How big is the brachet? • May need relevant background knowledge • May need help in drawing inferences • Teaching CVA =? teaching general reading comprehension • Vocabulary knowledge correlates with reading comprehension

  40. CVA & Science Education • Original goal: CVA in & for science education • Use CVA to improve reading of STEM materials • A side effect: CVA as science education • There are no ultimate authorities to consult • No answers in the back of the book of life! • As true for STEM as it is for reading about STEM • ⇒ Goal of education = • To learn how to learn on one’s own • Help develop confidence & desire to use that skill • CVA as sci. method in miniature furthers this goal: • Find clues/evidence (gathering data) • Integrate them with personal background knowledge • Use together to develop new theory (e.g., new meaning) • Test/revise new theory (on future encounters with word)

  41. Conclusion Developing a computational theory of CVA, which can become … • a useful educational technique for improving [STEM] vocabulary and reading comprehension • a model of the scientific method • a useful tool for learning on one’s own.

  42. Participants

  43. Web Page http://www.cse.buffalo.edu/~rapaport/cva.html
