
Constructing Grammar: a computational model of the acquisition of early constructions



Presentation Transcript


  1. Constructing Grammar: a computational model of the acquisition of early constructions. CS 182 Lecture, April 25, 2006

  2. What constitutes learning a language? • What are the sounds (Phonology) • How to make words (Morphology) • What do words mean (Semantics) • How to put words together (Syntax) • Social use of language (Pragmatics) • Rules of conversation (Pragmatics)

  3. What do we know about language development? (focusing mainly on first-language acquisition of English by typically developing children)

  4. Children are amazing learners. A rough timeline: cooing (0–6 mos) → reduplicated babbling (~6 mos) → first word (~12 mos) → two-word combinations (~2 yrs) → multi-word utterances (~3 yrs) → questions, complex sentence structures, conversational principles (4–5 yrs)

  5. Phonology: Non-native contrasts • Werker and Tees (1984) • Thompson: velar vs. uvular, /k'i/–/q'i/ • Hindi: retroflex vs. dental, /ʈa/–/ta/ • Young infants discriminate both non-native contrasts; English-learning infants lose this sensitivity by about 10–12 months

  6. Finding words: Statistical learning • Saffran, Aslin and Newport (1996) • Nonsense words /bidaku/, /padoti/, /golabu/ • concatenated into a continuous stream: /bidakupadotigolabubidaku…/ • 2 minutes of this continuous speech stream • By 8 months infants detect the words (vs. non-words and part-words) • The cue is distributional: syllable-to-syllable transitional probabilities are high within words and low across word boundaries, the same cue that helps segment "pretty baby" from fluent speech (see the sketch below)
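To make the transitional-probability cue concrete, here is a minimal sketch (mine, not from the lecture) that rebuilds a Saffran-style stream and computes the statistic infants could exploit: transitions are certain inside a word and unreliable across word boundaries.

import random
from collections import Counter

random.seed(0)
WORDS = [("bi", "da", "ku"), ("pa", "do", "ti"), ("go", "la", "bu")]

# Build a continuous syllable stream: random word order, no immediate
# repeats, mimicking the unsegmented 2-minute speech stream.
stream, prev = [], None
for _ in range(200):
    word = random.choice([w for w in WORDS if w is not prev])
    stream.extend(word)
    prev = word

pair_counts = Counter(zip(stream, stream[1:]))
syll_counts = Counter(stream[:-1])

def tp(x, y):
    """Transitional probability P(next syllable = y | current syllable = x)."""
    return pair_counts[(x, y)] / syll_counts[x]

print("within a word:", tp("bi", "da"))   # -> 1.0
print("across words :", tp("ku", "pa"))   # -> roughly 0.5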

  7. Word order: agent and patient • Hirsh-Pasek and Golinkoff (1996) • 1;4–1;7, mostly still in the one-word stage • Preferential looking: "Where is CM tickling BB?"

  8. Early syntax • agent + action ‘Daddy sit’ • action + object ‘drive car’ • agent + object ‘Mommy sock’ • action + location ‘sit chair’ • entity + location ‘toy floor’ • possessor + possessed ‘my teddy’ • entity + attribute ‘crayon big’ • demonstrative + entity ‘this telephone’

  9. From Single Words To Complex Utterances (Sachs corpus, CHILDES) • 1;11.3 FATHER: Nomi are you climbing up the books? NAOMI: up. NAOMI: climbing. NAOMI: books. • 2;0.18 MOTHER: what are you doing? NAOMI: I climbing up. MOTHER: you're climbing up? • 4;9.3 FATHER: what's the boy doing to the dog? NAOMI: squeezing his neck. NAOMI: and the dog climbed up the tree. NAOMI: now they're both safe. NAOMI: but he can climb trees.

  10. How Can Children Be So Good At Learning Language? • Gold's Theorem: no superfinite class of languages is identifiable in the limit from positive data only • Principles & Parameters: babies are born as blank slates but acquire language quickly (with noisy input and little correction) → language must be innate: Universal Grammar + parameter setting • But babies aren't born as blank slates! And they do not learn language in a vacuum!

  11. Modeling the acquisition of grammar: Theoretical assumptions

  12. Language Acquisition • Opulence of the substrate • Prelinguistic children already have rich sensorimotor representations and sophisticated social knowledge • intention inference, reference resolution • language-specific event conceptualizations (Bloom 2000, Tomasello 1995, Bowerman & Choi, Slobin, et al.) • Children are sensitive to statistical information • Phonological transitional probabilities • Even dependencies between non-adjacent items (Saffran et al. 1996, Gomez 2002)

  13. Language Acquisition • Basic Scenes • Simple clause constructions are associated directly with scenes basic to human experience (Goldberg 1995, Slobin 1985) • Verb Island Hypothesis • Children learn their earliest constructions (arguments, syntactic marking) on a verb-specific basis (Tomasello 1992) • Schematically: throw frisbee, throw ball, … → throw OBJECT; get ball, get bottle, … → get OBJECT (this should be reminiscent of your model-merging assignment)

  14. Comprehension is partial. (not just for dogs)

  15. What children pick up from what they hear • Children use rich situational context / cues to fill in the gaps • They also have at their disposal embodied knowledge and statistical correlations (i.e. experience) • Sample input: what did you throw it into? they're throwing this in here. they're throwing a ball. don't throw it Nomi. well you really shouldn't throw things Nomi you know. remember how we told you you shouldn't throw things.

  16. Language Learning Hypothesis: Children learn constructions that bridge the gap between what they know from language and what they know from the rest of cognition

  17. Modeling the acquisition of (early) grammar: Comprehension-driven, usage-based

  18. Embodied Construction Grammar (Bergen and Chang 2005)

construction THROWER-THROW-OBJECT
   constructional
      constituents
         t1 : REF-EXPRESSION
         t2 : THROW
         t3 : OBJECT-REF
   form
      t1f before t2f
      t2f before t3f
   meaning                      ← role-filler bindings
      t2m.thrower ↔ t1m
      t2m.throwee ↔ t3m
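As a rough illustration of what such a definition amounts to computationally, the sketch below encodes the same construction as a plain record. The Construction class and its field layout are my own simplification, not the ECG implementation.

from dataclasses import dataclass, field

@dataclass
class Construction:
    name: str
    constituents: dict                            # label -> constructional category
    form: list = field(default_factory=list)      # (x, y) pairs meaning "xf before yf"
    meaning: list = field(default_factory=list)   # (role path, constituent) bindings

THROWER_THROW_OBJECT = Construction(
    name="THROWER-THROW-OBJECT",
    constituents={"t1": "REF-EXPRESSION", "t2": "THROW", "t3": "OBJECT-REF"},
    form=[("t1", "t2"), ("t2", "t3")],                      # word-order constraints
    meaning=[("t2.thrower", "t1"), ("t2.throwee", "t3")],   # role-filler bindings
)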

  19. Analyzing "You Throw The Ball" [diagram: FORM (sound) on one side, MEANING (stuff) on the other. Lexical constructions pair "you" ↔ schema Addressee (subcase of Human), "throw" ↔ schema Throw (roles: thrower, throwee), "ball" ↔ schema Ball (subcase of Object), "block" ↔ schema Block (subcase of Object). The Thrower-Throw-Object construction adds the constraints t1 before t2, t2 before t3, t2.thrower ↔ t1, t2.throwee ↔ t3.]

  20. Learning-Analysis Cycle (Chang, 2004): Analyze → Hypothesize → Reorganize. 1. Learner passes input (Utterance + Situation) and current grammar to Analyzer. 2. Analyzer produces SemSpec and Constructional Analysis. 3. Learner updates grammar: a. Hypothesize new map. b. Reorganize grammar (merge or compose). c. Reinforce (based on usage).
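A runnable skeleton of this cycle follows; every function body is a placeholder standing in for the model's real analyzer and learning operators, so only the control flow should be taken at face value.

def analyze(grammar, utterance, situation):
    # Placeholder: the real analyzer returns a best-fit constructional
    # analysis plus a semantic specification (SemSpec).
    covered = [w for w in utterance if w in grammar]
    return covered, situation

def hypothesize(grammar, analysis, semspec):
    # Placeholder: propose form-meaning maps for material the current
    # grammar left unaccounted for (relational mapping).
    return []

def reorganize(grammar):
    # Placeholder: merge similar constructions, compose co-occurring ones.
    return grammar

def reinforce(grammar, analysis):
    # Placeholder: bump usage counts of constructions in the analysis.
    for cxn in analysis:
        grammar[cxn] += 1

def learning_cycle(grammar, corpus):
    for utterance, situation in corpus:
        analysis, semspec = analyze(grammar, utterance, situation)  # 1. Analyze
        for cxn in hypothesize(grammar, analysis, semspec):         # 3a. Hypothesize
            grammar[cxn] = 1
        grammar = reorganize(grammar)                               # 3b. Reorganize
        reinforce(grammar, analysis)                                # 3c. Reinforce
    return grammar

grammar = {"you": 0, "throw": 0, "ball": 0}
learning_cycle(grammar, [(["you", "throw", "the", "ball"], {"scene": "throw"})])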

  21. Hypothesizing a new construction through relational mapping

  22. Initial Single-Word Stage: lexical constructions pairing FORM (sound) with MEANING (stuff) • "you" ↔ schema Addressee (subcase of Human) • "throw" ↔ schema Throw (roles: thrower, throwee) • "ball" ↔ schema Ball (subcase of Object) • "block" ↔ schema Block (subcase of Object)

  23. New Data: "You Throw The Ball" [diagram: the lexical constructions map "you" → Addressee, "throw" → Throw (roles thrower, throwee), "ball" → Ball; "the" is unknown. The SITUATION pane shows the Addressee as the thrower and the Ball as the throwee of the Throw event. Relational mapping proposes a new role-filler binding plus a form constraint (throw before ball): the throw-ball map.]

  24. New Construction Hypothesized

construction THROW-BALL
   constructional
      constituents
         t : THROW
         b : BALL
   form
      tf before bf
   meaning
      tm.throwee ↔ bm

  25. Three kinds of meaning relations • When B.m fills a role of A.m: throw ball → throw.throwee ↔ ball • When A.m and B.m are both filled by X: put ball down → put.mover ↔ ball, down.tr ↔ ball • When A.m and B.m both fill roles of X: Nomi ball → possession.possessor ↔ Nomi, possession.possessed ↔ ball
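A toy version of the first relation type (role-filler) is sketched below, under the assumption that the situation supplies role bindings as simple dotted paths; the function name and representation are illustrative only.

def role_filler_maps(a, b, situation):
    """Return bindings where meaning b fills a role of meaning a.
    situation maps role paths like 'throw.throwee' to their fillers."""
    maps = []
    for path, filler in situation.items():
        schema, role = path.split(".")
        if schema == a and filler == b:
            maps.append(f"{a}.{role} <-> {b}")
    return maps

situation = {"throw.thrower": "Naomi", "throw.throwee": "ball"}
print(role_filler_maps("throw", "ball", situation))
# -> ['throw.throwee <-> ball'], the seed of the THROW-BALL construction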

  26. Reorganizing the current grammar through merge and compose

  27. Merging Similar Constructions • throw the block: Throw.throwee = Block, throw before block • throw-ing the ball: Throw.throwee = Ball, Throw.aspect = ongoing, throw before -ing, throw before ball • merge → THROW-OBJECT: THROW.throwee = Objectm, throw before Objectf

  28. Resulting Construction

construction THROW-OBJECT
   constructional
      constituents
         t : THROW
         o : OBJECT
   form
      tf before of
   meaning
      tm.throwee ↔ om
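Sketched in code, the generalization step can be seen as walking an ontology to the nearest common supertype of the constituents that differ. The toy ontology and the tuple representation of constructions are my simplifications.

ONTOLOGY = {"Ball": "Object", "Block": "Object", "Object": "Entity"}

def supertypes(cat):
    """The category itself plus its chain of ontology supertypes."""
    chain = [cat]
    while cat in ONTOLOGY:
        cat = ONTOLOGY[cat]
        chain.append(cat)
    return chain

def common_supertype(c1, c2):
    s2 = set(supertypes(c2))
    return next(s for s in supertypes(c1) if s in s2)

def merge(cxn1, cxn2):
    """Merge two constructions given as (name, [constituent categories])."""
    cats = [common_supertype(a, b) for a, b in zip(cxn1[1], cxn2[1])]
    return ("-".join(cats).upper(), cats)

print(merge(("THROW-BALL", ["Throw", "Ball"]),
            ("THROW-BLOCK", ["Throw", "Block"])))
# -> ('THROW-OBJECT', ['Throw', 'Object'])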

  29. Composing Co-occurring Constructions • throw the ball: THROW.throwee = Ball, throw before ball • ball off: Motion m, m.mover = Ball, m.path = Off, ball before off • compose → THROW-BALL-OFF: THROW.throwee = Ball; Motion m, m.mover = Ball, m.path = Off; throw before ball, ball before off

  30. Resulting Construction

construction THROW-BALL-OFF
   constructional
      constituents
         t : THROW
         b : BALL
         o : OFF
   form
      tf before bf
      bf before of
   meaning
      evokes MOTION as m
      tm.throwee ↔ bm
      m.mover ↔ bm
      m.path ↔ om
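A minimal sketch of the chunking step, with the same caveat that the representation is mine: the composed construction concatenates the form constraints of its parts and takes the union of their meaning bindings.

def compose(cxn1, cxn2):
    """Each construction: (name, form constraints, set of meaning bindings)."""
    name = f"{cxn1[0]}-{cxn2[0]}"  # toy naming; the real learner folds the shared BALL
    form = cxn1[1] + cxn2[1]       # keep both orderings, in sequence
    meaning = cxn1[2] | cxn2[2]    # union of role-filler bindings
    return (name, form, meaning)

throw_ball = ("THROW-BALL", [("throw", "ball")], {"Throw.throwee <-> Ball"})
ball_off = ("BALL-OFF", [("ball", "off")], {"m.mover <-> Ball", "m.path <-> Off"})

# If the two constructions co-occur often enough in analyses, chunk them:
print(compose(throw_ball, ball_off))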

  31. Precisely defining the learning algorithm

  32. Language Learning Problem • Prior knowledge • Initial grammar G (set of ECG constructions) • Ontology (category relations) • Language comprehension model (analysis/resolution) • Hypothesis space: new ECG grammar G’ • Search = processes for proposing new constructions • Relational Mapping, Merge, Compose

  33. Language Learning Problem • Performance measure • Goal: Comprehension should improve with training • Criterion: need some objective function to guide learning. Two equivalent framings: maximize the probability of the model given the data, P(G|D) ∝ P(G) · P(D|G), or minimize description length, cost(G|D) (next slide).

  34. Minimum Description Length • Choose grammar G to minimize cost(G|D): • cost(G|D) = α · size(G) + β · complexity(D|G) • Approximates Bayesian learning: minimizing cost(G|D) ≈ maximizing posterior probability P(G|D) • Size of grammar: size(G) ≈ prior P(G) • favors fewer/smaller constructions/roles; isomorphic mappings • Complexity of data given grammar ≈ likelihood P(D|G) • favors simpler analyses (fewer, more likely constructions) • based on derivation length + score of derivation

  35. Size Of Grammar • Size of the grammar G is the sum of the size of each construction: size(G) = Σ(c∈G) size(c) • Size of each construction c is: size(c) = nc + mc + Σ(e∈c) length(e), where • nc = number of constituents in c, • mc = number of constraints in c, • length(e) = slot chain length of element reference e

  36. Example: The Throw-Ball Cxn

construction THROW-BALL
   constructional
      constituents
         t : THROW
         b : BALL
   form
      tf before bf
   meaning
      tm.throwee ↔ bm

size(THROW-BALL) = 2 constituents + 2 constraints + slot-chain lengths (1 + 1 for tf, bf; 2 + 1 for tm.throwee, bm) = 2 + 2 + (2 + 3) = 9
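To check the arithmetic, here is the size formula from the previous slide in code; encoding each constraint as a list of its references' slot-chain lengths is my shorthand.

def size(n_constituents, constraints):
    """size(c) = n_c + m_c + sum of the slot-chain lengths of the
    element references in c's constraints."""
    m = len(constraints)
    chain_total = sum(sum(refs) for refs in constraints)
    return n_constituents + m + chain_total

throw_ball = [
    [1, 1],  # form: tf before bf         -> lengths 1 + 1 = 2
    [2, 1],  # meaning: tm.throwee <-> bm -> lengths 2 + 1 = 3
]
print(size(2, throw_ball))  # -> 2 + 2 + (2 + 3) = 9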

  37. Complexity of Data Given Grammar • Complexity of the data D given grammar G is the sum of the analysis score of each input token d: complexity(D|G) = Σ(d∈D) score(d) • The analysis score of each input token d combines, over the constructions c used in the analysis of d: • weightc ≈ relative frequency of c, • |typer| = number of ontology items of type r used, • heightd = height of the derivation graph, • semfitd = semantic fit provided by the analyzer
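Putting the two halves of the objective together, a hedged sketch: α, β and the toy numbers are arbitrary, and the per-token scores stand in for the weight/type/height/semantic-fit terms above rather than computing them.

def cost(construction_sizes, token_scores, alpha=1.0, beta=1.0):
    """cost(G|D) = alpha * size(G) + beta * complexity(D|G)."""
    size_G = sum(construction_sizes)  # sum of size(c) over the grammar
    complexity_D = sum(token_scores)  # sum of score(d) over input tokens
    return alpha * size_G + beta * complexity_D

# Toy comparison: one general THROW-OBJECT construction (size 9) beats
# separate THROW-BALL and THROW-BLOCK (size 9 each) as long as it does
# not make the per-token analyses much worse.
print(cost([9, 9], [3.0, 3.0]))  # two item-specific constructions -> 24.0
print(cost([9], [3.5, 3.5]))     # one merged construction         -> 16.0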

  38. Preliminary Results

  39. Experiment: Learning Verb Islands • Subset of the CHILDES database of parent-child interactions (MacWhinney 1991; Slobin et al.) • coded by developmental psychologists for • form: particles, deictics, pronouns, locative phrases, etc. • meaning: temporality, person, pragmatic function, type of motion (self-movement vs. caused movement; animate being vs. inanimate object, etc.) • crosslinguistic (English, French, Italian, Spanish) • English motion utterances: 829 parent, 690 child • English, all utterances: 3160 adult, 5408 child • age span is 1;2 to 2;6

  40. Learning Throw-Constructions

  41. Learning Results

  42. Summary • Cognitively plausible situated learning processes • What do kids start with? • perceptual, motor, social, world knowledge • meanings of single words • What kind of input drives acquisition? • Social-pragmatic knowledge • Statistical properties of linguistic input • What is the learning loop? • Use existing linguistic knowledge to analyze input • Use social-pragmatic knowledge to understand situation • Hypothesize new constructions to bridge the gap
