
Presentation Transcript


  1. AND Network

     Input    Output
     0 0      0
     1 0      0
     0 1      0
     1 1      1

  2. OR Network

     Input    Output
     0 0      0
     1 0      1
     0 1      1
     1 1      1

     NETWORK CONFIGURED BY TLEARN
     # weights after 10000 sweeps
     # WEIGHTS
     # TO NODE 1
     -1.9083807468  ## bias to 1
      4.3717832565  ## i1 to 1
      4.3582129478  ## i2 to 1
      0.0000000000
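The reported weights can be checked directly. A minimal sketch, assuming TLEARN's standard logistic (sigmoid) activation, that runs the single output unit over the four input patterns:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Weights reported by TLEARN for the OR network (slide 2)
bias, w_i1, w_i2 = -1.9083807468, 4.3717832565, 4.3582129478

def or_unit(i1, i2):
    """Single sigmoid unit: activation = sigmoid(bias + w_i1*i1 + w_i2*i2)."""
    return sigmoid(bias + w_i1 * i1 + w_i2 * i2)

for i1, i2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(i1, i2, round(or_unit(i1, i2)))   # prints the OR truth table
```

Rounding the activation to the nearest integer reproduces the OR truth table: only [0 0] leaves the unit below the bias threshold.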

  3. XOR Network

  4. XOR Network

     -3.0456776619  ## bias to 1
      5.5165352821  ## i1 to 1
     -5.7562727928  ## i2 to 1

  5. XOR Network

     Node 1:
     Input    Output
     0 0      0
     1 0      1
     0 1      0
     1 1      0

     -3.0456776619  ## bias to 1
      5.5165352821  ## i1 to 1
     -5.7562727928  ## i2 to 1

  6. XOR Network

     Node 1:
     Input    Output
     0 0      0
     1 0      1
     0 1      0
     1 1      0

     -3.0456776619  ## bias to 1
      5.5165352821  ## i1 to 1
     -5.7562727928  ## i2 to 1
     -3.6789164543  ## bias to 2
     -6.4448370934  ## i1 to 2
      6.4957633018  ## i2 to 2

  7. XOR Network

     Node 1:              Node 2:
     Input    Output      Input    Output
     0 0      0           0 0      0
     1 0      1           1 0      0
     0 1      0           0 1      1
     1 1      0           1 1      0

     -3.0456776619  ## bias to 1
      5.5165352821  ## i1 to 1
     -5.7562727928  ## i2 to 1
     -3.6789164543  ## bias to 2
     -6.4448370934  ## i1 to 2
      6.4957633018  ## i2 to 2

  8. XOR Network

     Node 1:              Node 2:
     Input    Output      Input    Output
     0 0      0           0 0      0
     1 0      1           1 0      0
     0 1      0           0 1      1
     1 1      0           1 1      0

     -3.0456776619  ## bias to 1
      5.5165352821  ## i1 to 1
     -5.7562727928  ## i2 to 1
     -3.6789164543  ## bias to 2
     -6.4448370934  ## i1 to 2
      6.4957633018  ## i2 to 2
     -4.4429202080  ## bias to output
      9.0652370453  ## 1 to output
      8.9045801163  ## 2 to output

  9. XOR Network

     Node 1:              Node 2:              Hidden units to output:
     Input    Output      Input    Output      Input    Output
     0 0      0           0 0      0           0 0      0
     1 0      1           1 0      0           1 0      1
     0 1      0           0 1      1           0 1      1
     1 1      0           1 1      0           1 1      1

     -3.0456776619  ## bias to 1
      5.5165352821  ## i1 to 1
     -5.7562727928  ## i2 to 1
     -3.6789164543  ## bias to 2
     -6.4448370934  ## i1 to 2
      6.4957633018  ## i2 to 2
     -4.4429202080  ## bias to output
      9.0652370453  ## 1 to output
      8.9045801163  ## 2 to output

  10. XOR Network

      Node 1:              Node 2:              Hidden units to output:
      Input    Output      Input    Output      Input    Output
      0 0      0           0 0      0           0 0      0
      1 0      1           1 0      0           1 0      1
      0 1      0           0 1      1           0 1      1
      1 1      0           1 1      0           1 1      1

      -3.0456776619  ## bias to 1
       5.5165352821  ## i1 to 1
      -5.7562727928  ## i2 to 1
      -3.6789164543  ## bias to 2
      -6.4448370934  ## i1 to 2
       6.4957633018  ## i2 to 2
      -4.4429202080  ## bias to output
       9.0652370453  ## 1 to output
       8.9045801163  ## 2 to output

      The mapping from the hidden units to the output is an OR network that never receives a [1 1] input.
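The full weight set can be verified with a forward pass through the two-layer network. A minimal sketch, assuming logistic activations throughout (TLEARN's default):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Weights reported by TLEARN for the XOR network (slides 4-10)
B1, W1_I1, W1_I2 = -3.0456776619, 5.5165352821, -5.7562727928   # node 1
B2, W2_I1, W2_I2 = -3.6789164543, -6.4448370934, 6.4957633018   # node 2
BO, WO_1, WO_2 = -4.4429202080, 9.0652370453, 8.9045801163      # output

def xor_net(i1, i2):
    """Forward pass: two sigmoid hidden units feeding one sigmoid output."""
    h1 = sigmoid(B1 + W1_I1 * i1 + W1_I2 * i2)   # fires only for [1 0]
    h2 = sigmoid(B2 + W2_I1 * i1 + W2_I2 * i2)   # fires only for [0 1]
    return sigmoid(BO + WO_1 * h1 + WO_2 * h2)   # OR of the hidden units

for i1, i2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(i1, i2, round(xor_net(i1, i2)))   # prints the XOR truth table
```

Rounded activations reproduce XOR: each hidden unit detects one of the two "exclusive" inputs, and the output unit ORs them, never receiving [1 1].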

  11. The Past Tense and Beyond

  12. Classic Developmental Story
      • Initial mastery of regular and irregular past tense forms
      • Overregularization appears only later (e.g. goed, comed)
      • ‘U-Shaped’ developmental pattern taken as evidence for learning of a morphological rule: V + [+past] --> stem + /d/

  13. Rumelhart & McClelland 1986
      Model learns to classify regulars and irregulars, based on sound similarity alone.
      Shows U-shaped developmental profile.

  14. What is really at stake here?
      • Abstraction
      • Operations over variables
      • Learning based on input

  15. What is not at stake here
      • Feedback, negative evidence, etc.

  16. Who has the most at stake here?
      • Those who deny the need for rules/variables in language have the most to lose here
      • …but if they are successful, they bring with them a simple and attractive learning theory, and mechanisms that can readily be grounded at the neural level
      • However, if the advocates of rules/variables succeed here or elsewhere, they face the more difficult challenge at the neuroscientific level

  17. Questions about Lab 2b
      • How did the network perform?
      • How well did the network generalize to novel stems?
      • What was the effect of the frequency manipulation?
      • Does the network need to internalize a Blocking Principle?
      • Does the network explicitly represent a default form?

  18. Beyond Sound Similarity
      Regulars and Associative Memory
      1. Are regulars different?
      2. Do regulars implicate operations over variables?
      Neuropsychological Dissociations
      Other Domains of Morphology

  19. Beyond Sound Similarity
      Regulars and Associative Memory
      1. Are regulars different?
      2. Do regulars implicate operations over variables?
      Neuropsychological Dissociations
      Other Domains of Morphology

  20. (Pinker & Ullman 2002)

  21. Beyond Sound Similarity
      Zero-derived denominals are regular:
      • Soldiers ringed the city / *Soldiers rang the city
      • high-sticked, grandstanded, … / *high-stuck, *grandstood, …
      • Productive in adults & children
      • Shows sensitivity to morphological structure: [[ stemN ] øV ]-ed
      Provides good evidence that sound similarity is not everything
      But nothing prevents a model from using a richer similarity metric:
      • morphological structure (for ringed)
      • semantic similarity (for low-lifes)

  22. Beyond Sound Similarity
      Regulars and Associative Memory
      1. Are regulars different?
      2. Do regulars implicate operations over variables?
      Neuropsychological Dissociations
      Other Domains of Morphology

  23. Regulars & Associative Memory
      Regulars are productive, need not be stored
      Irregulars are not productive, must be stored
      But are regulars immune to effects of associative memory?
      • frequency
      • over-irregularization
      Pinker & Ullman: regulars may be stored, but they can also be generated on-the-fly
      • a ‘race’ can determine which of the two routes wins
      • some tasks are more likely to show effects of stored regulars

  24. Child vs. Adult Impairments
      Specific Language Impairment: early claims that regulars show greater impairment than irregulars are not confirmed
      Pinker & Ullman 2002b: ‘The best explanation is that language-impaired people are indeed impaired with rules, […] but can memorize common regular forms.’
      Regulars show consistent frequency effects in SLI, but not in controls.
      ‘This suggests that children growing up with a grammatical deficit are better at compensating for it via memorization than are adults who acquired their deficit later in life.’

  25. Beyond Sound Similarity
      Regulars and Associative Memory
      1. Are regulars different?
      2. Do regulars implicate operations over variables?
      Neuropsychological Dissociations
      Other Domains of Morphology

  26. Neuropsychological Dissociations
      Ullman et al. 1997
      • Alzheimer’s disease patients: poor memory retrieval; poor irregulars, good regulars
      • Parkinson’s disease patients: impaired motor control, good memory; good irregulars, poor regulars
      • Striking correlation involving laterality of effect
      Marslen-Wilson & Tyler 1997
      • Normals: past tense primes stem
      • 2 Broca’s patients: irregulars prime stems; inhibition for regulars
      • 1 patient with bilateral lesion: regulars prime stems; no priming for irregulars or semantic associates

  27. Morphological Priming
      Lexical Decision Task: CAT, TAC, BIR, LGU, DOG (press ‘Yes’ if this is a word)
      Priming: facilitation in decision times when a related word precedes the target, relative to an unrelated control, e.g., {dog, rug} - cat
      Marslen-Wilson & Tyler 1997:
      • Regular: {jumped, locked} - jump
      • Irregular: {found, shows} - find
      • Semantic: {swan, hay} - goose
      • Sound: {gravy, sherry} - grave

  28. Neuropsychological Dissociations
      Bird et al. 2003 complain that arguments for selective difficulty with regulars are confounded with the phonological complexity of the word-endings
      Pinker & Ullman 2002: the weight of evidence still supports the dissociation; Bird et al.’s materials contained additional confounds

  29. Brain Imaging Studies
      Jaeger et al. 1996, Language: PET study of past tense
      • Task: generate past from stem
      • Design: blocked conditions
      • Result: different areas of activation for regulars and irregulars
      • Is this evidence decisive? Task demands are very different; the difference could show up in a network; doesn’t implicate variables
      Münte et al. 1997: ERP study of violations
      • Task: sentence reading
      • Design: mixed
      • Result: regulars ~LAN, irregulars ~N400
      • Is this evidence decisive? Allows the possibility of comparison with other violations

  30. Beyond Sound Similarity
      Regulars and Associative Memory
      1. Are regulars different?
      2. Do regulars implicate operations over variables?
      Neuropsychological Dissociations
      Other Domains of Morphology

  31. Low-Frequency Defaults
      German Plurals:
      die Straße    die Straßen
      die Frau      die Frauen
      der Apfel     die Äpfel
      die Mutter    die Mütter
      das Auto      die Autos
      der Park      die Parks
      die Schmidts
      The -s plural is low frequency, used for loan-words, denominals, names, etc.
      Response frequency is not the critical factor in a system that focuses on similarity; the distribution in the similarity space is crucial.
      In a similarity space with islands of reliability, a network can learn the islands, or it can learn to associate a form with the space between the islands.

  32. Similarity Space

  33. Similarity Space

  34. Arabic Broken Plural
      CvCC
      • nafs       nufuus        ‘soul’
      • qidh       qidaah        ‘arrow’
      CvvCv(v)C
      • xaatam     xawaatim      ‘signet ring’
      • jaamuus    jawaamiis     ‘buffalo’
      Sound Plural
      • shuway?ir  shuway?ir-uun ‘poet (dim.)’
      • kaatib     kaatib-uun    ‘writing (participle)’
      • hind       hind-aat      ‘Hind (fem. name)’
      • ramadaan   ramadaan-aat  ‘Ramadan (month)’

  35. German Plurals (Hahn & Nakisa 2000)

  36. Syntax, Semantics, & Statistics

  37. Starting Small Simulation
      • How well did the network perform?
      • How did it manage to learn?

  38. Generalization
      Training Items
      • Input: 1 0 1 0   Output: 1 0 1 0
      • Input: 0 1 0 0   Output: 0 1 0 0
      • Input: 1 1 1 0   Output: 1 1 1 0
      • Input: 0 0 0 0   Output: 0 0 0 0
      Test Item
      • Input: 1 1 1 1   Output: ? ? ? ?

  39. Generalization
      Training Items
      • Input: 1 0 1 0   Output: 1 0 1 0
      • Input: 0 1 0 0   Output: 0 1 0 0
      • Input: 1 1 1 0   Output: 1 1 1 0
      • Input: 0 0 0 0   Output: 0 0 0 0
      Test Item
      • Input: 1 1 1 1   Output: ? ? ? ?
        1 1 1 1 (Humans)
        1 1 1 0 (Network)

  40. Generalization
      Training Items
      • Input: 1 0 1 0   Output: 1 0 1 0
      • Input: 0 1 0 0   Output: 0 1 0 0
      • Input: 1 1 1 0   Output: 1 1 1 0
      • Input: 0 0 0 0   Output: 0 0 0 0
      Test Item
      • Input: 1 1 1 1   Output: ? ? ? ?
        1 1 1 1 (Humans)
        1 1 1 0 (Network)
      Generalization fails because learning is local

  41. Generalization
      Training Items
      • Input: 1 0 1 0   Output: 1 0 1 0
      • Input: 0 1 0 0   Output: 0 1 0 0
      • Input: 1 1 1 0   Output: 1 1 1 0
      • Input: 0 0 0 0   Output: 0 0 0 0
      Test Item
      • Input: 1 1 1 1   Output: ? ? ? ?
        1 1 1 1 (Humans)
        1 1 1 1 (Network)
      Generalization succeeds because representations are shared
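The local-learning failure can be reproduced with a hypothetical sketch (not the lab simulation itself): give each output unit its own weight to the corresponding input bit only, train with the delta rule, and test on 1 1 1 1.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Identity-mapping task from the slides: output should copy the input.
train = [([1, 0, 1, 0], [1, 0, 1, 0]),
         ([0, 1, 0, 0], [0, 1, 0, 0]),
         ([1, 1, 1, 0], [1, 1, 1, 0]),
         ([0, 0, 0, 0], [0, 0, 0, 0])]

w = [0.0] * 4   # one local weight per unit (input i -> output i)
b = [0.0] * 4   # one bias per unit
lr = 0.5
for _ in range(2000):                 # delta-rule training
    for x, t in train:
        for i in range(4):
            y = sigmoid(w[i] * x[i] + b[i])
            w[i] += lr * (t[i] - y) * x[i]
            b[i] += lr * (t[i] - y)

# Unit 4 never saw a 1 during training, so its weight never moved.
out = [round(sigmoid(w[i] * 1 + b[i])) for i in range(4)]  # test 1 1 1 1
print(out)   # [1, 1, 1, 0] -- the network's answer, not the human one
```

Because the weight update for unit 4 is always multiplied by its input (which is 0 in every training item), that unit learns nothing about the identity mapping the other units have acquired; only shared (distributed) representations let the regularity transfer to the untrained bit.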
