
Abstract Neuron



Presentation Transcript


  1. Abstract Neuron: a unit with inputs i0 = 1, i1, …, in, weights w0, w1, …, wn, and output y = 1 if net > 0, 0 otherwise, where net is the weighted sum w0·i0 + w1·i1 + … + wn·in (diagram).
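
A minimal sketch of this threshold rule in Python (the weights and inputs below are made-up values for illustration, not from the slide):

```python
# Abstract (threshold) neuron: y = 1 if net > 0, else 0, where net is the
# weighted sum of the inputs and i0 = 1 acts as a bias input.

def abstract_neuron(inputs, weights):
    """inputs and weights include the bias pair i0 = 1, w0."""
    net = sum(w * i for w, i in zip(weights, inputs))
    return 1 if net > 0 else 0

# Illustrative values only: bias weight -1.5, two ordinary inputs.
print(abstract_neuron([1, 1, 1], [-1.5, 1.0, 1.0]))  # 1: net = 0.5 > 0
print(abstract_neuron([1, 1, 0], [-1.5, 1.0, 1.0]))  # 0: net = -0.5
```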

  2. Link to Vision: The Necker Cube

  3. Constrained Best Fit in Nature: inanimate and animate examples (figure).

  4. Computing other relations • The 2/3 node is a useful function: it activates its output when any 2 of its 3 inputs are active • Such a node is also called a triangle node and will be useful for many representations.
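
For concreteness, a toy sketch of the 2/3 behavior (the function name and the example activations are illustrative assumptions):

```python
# "2/3" (triangle) node: it becomes active when any two of its three
# connected units are active, letting it complete the third role of a
# Relation-Object-Value binding.

def triangle_node(a, b, c):
    """a, b, c are 0/1 activations of the three connected units."""
    return 1 if (a + b + c) >= 2 else 0

print(triangle_node(1, 1, 0))  # 1: two inputs active, the node fires
print(triangle_node(1, 0, 0))  # 0: only one input active
```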

  5. Triangle nodes and McCulloch-Pitts neurons? A triangle node connects Relation (A), Object (B), and Value (C) (diagram).

  6. “They all rose” • triangle nodes: when two of the abstract neurons fire, the third also fires • a model of spreading activation

  7. Basic Ideas • Parallel activation streams. • Top-down and bottom-up activation combine to determine the best matching structure. • Triangle nodes bind features of objects to values • Mutual inhibition and competition between structures • Mental connections are active neural connections

  8. Behavioral Experiments • Identity – Mental activity is Structured Neural Activity • Spreading Activation— Psychological model/theory behind priming and interference experiments • Simulation — Necessary for meaningfulness and contextual inference • Parameters — Govern simulation, strict inference, link to language

  9. Bottom-up vs. Top-down Processes • Bottom-up: When processing is driven by the stimulus • Top-down: When knowledge and context are used to assist and drive processing • Interaction: The stimulus is the basis of processing but almost immediately top-down processes are initiated

  10. Stroop Effect • Interference between form and meaning

  11. Name the words: Book Car Table Box Trash Man Bed Corn Sit Paper Coin Glass House Jar Key Rug Cat Doll Letter Baby Tomato Check Phone Soda Dish Lamp Woman

  12. Name the print color of the words: Blue Green Red Yellow Orange Black Red Purple Green Red Blue Yellow Black Red Green White Blue Yellow Red Black Blue White Red Yellow Green Black Purple

  13. Procedure for experiment that demonstrates the word-superiority effect. First the word is presented, then the XXXX’s, then the letters.

  14. Word-Superiority Effect, Reicher (1969) • Which condition resulted in faster & more accurate recognition of the letter? • The word condition • Letters are recognized faster when they are part of a word than when they are alone • This rejects the completely bottom-up feature model • Also a challenge for serial processing

  15. Connectionist Model, McClelland & Rumelhart (1981) • Knowledge is distributed and processing occurs in parallel, with both bottom-up and top-down influences • This model can explain the Word-Superiority Effect because it can account for context effects
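
A toy sketch of the interactive-activation idea (the lexicon, update rule, and parameter values below are simplified assumptions, not the published McClelland & Rumelhart model): letters receive bottom-up evidence, words receive support from their letters, and words feed activation back down to their letters, which is how context helps a letter embedded in a word.

```python
# Toy interactive activation: bottom-up evidence drives letter units,
# letters excite consistent words, and active words feed activation back
# to their letters, so a letter in a word gets more support than the
# same letter presented alone.

LEXICON = ["work", "word", "weak"]          # tiny illustrative lexicon

def run(evidence, steps=20, up=0.1, down=0.1, decay=0.05):
    """evidence maps (position, letter) -> bottom-up input strength."""
    letter_act = {k: 0.0 for k in evidence}
    word_act = {w: 0.0 for w in LEXICON}
    for _ in range(steps):
        new_letters = {}
        for (pos, ch), act in letter_act.items():
            # top-down feedback from every word consistent with this letter
            feedback = sum(word_act[w] for w in LEXICON
                           if pos < len(w) and w[pos] == ch)
            new_letters[(pos, ch)] = ((1 - decay) * act
                                      + evidence[(pos, ch)] + down * feedback)
        new_words = {}
        for w in LEXICON:
            # bottom-up support from the word's letters
            support = sum(letter_act.get((p, c), 0.0) for p, c in enumerate(w))
            new_words[w] = (1 - decay) * word_act[w] + up * support
        letter_act, word_act = new_letters, new_words
    return letter_act, word_act

# The same weak evidence for 'k' in position 3, with and without word context.
in_word = {(0, "w"): 0.2, (1, "o"): 0.2, (2, "r"): 0.2, (3, "k"): 0.05}
alone = {(3, "k"): 0.05}
print(run(in_word)[0][(3, "k")])  # larger: top-down support from "work"
print(run(alone)[0][(3, "k")])    # smaller: no word-level context
```

The comparison is only qualitative: the same weak evidence for a letter ends up with more activation when the surrounding letters support a word that contains it.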

  16. Connectionist Model of Word Recognition

  17. Interaction in language processing: Pragmatic constraints on lexical access. Jim Magnuson, Columbia University

  18. Information integration • A central issue in psycholinguistics and cognitive science: • When/how are such sources integrated? • Two views • Interaction • Use information as soon as it is available • Free flow between levels of representation • Modularity • Protect and optimize levels by encapsulation • Staged serial processing • Reanalyze / appeal to top-down information only when needed

  19. Reaction times in milliseconds after “They all rose”, at 0 delay and at a 200 ms delay (figure).

  20. Example: Modularity and word recognition • Tanenhaus et al. (1979) [also Swinney, 1979] • Given a homophone like rose, and a context biased towards one sense, when is context integrated? • Spoken sentence primes ending in homophones: They all rose vs. They bought a rose • Secondary task: name a displayed orthographic word • Probe at offset of ambiguous word: priming for both “stood” and “flower” • 200 ms later: only priming for the appropriate sense • Suggests encapsulation followed by rapid integration • But the constraint here is weak, so this may overestimate modularity • How could we examine strong constraints in natural contexts?

  21. Allopenna, Magnuson & Tanenhaus (1998): eye-tracking setup with an eye camera, eye-tracking computer, and scene camera; spoken instruction “Pick up the beaker” (figure).

  22. TRACE predictions: Do rhymes compete? • Cohort (Marslen-Wilson): onset similarity is primary because of the incremental nature of speech (serial/staged; Shortlist/Merge) • Cat activates cap, cast, cattle, camera, etc. • Rhymes won't compete • NAM (Neighborhood Activation Model; Luce): global similarity is primary • Cat activates bat, rat, cot, cast, etc. • Rhymes are among the set of strong competitors • TRACE (McClelland & Elman): global similarity constrained by the incremental nature of speech • Cohorts and rhymes compete, but with different time courses
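
To make the competitor sets concrete, a simplified illustration over spellings (the actual models operate over phonemes unfolding in time; the two-letter onset and three-letter ending criteria here are crude stand-ins):

```python
# Classify competitors of a target into cohort (shared onset) and rhyme
# (shared ending) sets, using the display items from Allopenna et al.

def classify_competitors(target, lexicon):
    cohorts, rhymes = [], []
    for w in lexicon:
        if w == target:
            continue
        if w[:2] == target[:2]:          # shares the onset: cohort competitor
            cohorts.append(w)
        elif w[-3:] == target[-3:]:      # shares the ending: rhyme competitor
            rhymes.append(w)
    return cohorts, rhymes

display = ["beaker", "beetle", "speaker", "carriage"]
print(classify_competitors("beaker", display))
# (['beetle'], ['speaker']): cohort models predict only the first group
# competes; TRACE predicts both, with rhymes activated later.
```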

  23. Allopenna et al. Results

  24. Study 1 Conclusions • As predicted by interactive models, cohorts and rhymes are activated, with different time courses • Eye movement paradigm • More sensitive than conventional paradigms • More naturalistic • Simultaneous measures of multiple items • Transparently linkable to computational model • Time locked to speech at a fine grain

  25. Theoretical conclusions • Natural contexts provide strong constraints that are used • When those constraints are extremely predictive, they are integrated as quickly as we can measure • Suggests rapid, continuous interaction among • Linguistic levels • Nonlinguistic context • Even for processes assumed to be low-level and automatic • Constrains processing theories, also has implications for, e.g., learnability

  26. Producing words from pictures or from other words: A comparison of aphasic lexical access from two different input modalities. Gary Dell, with Myrna Schwartz, Dan Foygel, Nadine Martin, Eleanor Saffran, Deborah Gagnon, Rick Hanley, Janice Kay, Susanne Gahl, Rachel Baron, Stefanie Abel, Walter Huber

  27. Boxes and arrows in the linguistic system: Semantics, Syntax, Lexicon, Output Phonology, Input Phonology (box-and-arrow diagram).

  28. Picture Naming Task: the same boxes (Semantics, Syntax, Lexicon, Output Phonology, Input Phonology), with the output Say: “cat” (diagram).

  29. A 2-step Interactive Model of Lexical Access in Production: a semantic-feature layer; a word layer (FOG, DOG, CAT, RAT, MAT); and a phoneme layer with onsets (f, r, d, k, m), vowels (ae, o), and codas (t, g) (network diagram).

  30. Step 1 – Lemma Access: activate the semantic features of CAT (network diagram).

  31. Step 1 – Lemma Access: activation spreads through the network (network diagram).

  32. Step 1 – Lemma Access: the most active word from the proper category is selected and linked to a syntactic frame (NP → N) (network diagram).

  33. Step 2 – Phonological Access: a jolt of activation is sent to the selected word (network diagram).

  34. Step 2 – Phonological Access: activation spreads through the network (network diagram).

  35. Step 2 – Phonological Access: the most activated phonemes are selected into a syllable frame (onset, vowel, coda) (network diagram).

  36. Semantic Error – “dog”: shared features activate semantic neighbors (network diagram).

  37. Formal Error – “mat”: phoneme-to-word feedback activates formal neighbors (network diagram).

  38. Mixed Error – “rat”: mixed semantic-formal neighbors gain activation from both top-down and bottom-up sources (network diagram).

  39. Errors of Phonological Access – “dat”, “mat”: selection of incorrect phonemes (network diagram).
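
A minimal sketch of the two steps just walked through (the toy lexicon, semantic features, and parameter values are assumptions for illustration, not the published model; s and p are the semantic-word and phoneme-word weights that the lesioning hypothesis below reduces):

```python
import random

# Toy two-step interactive model of lexical access. Step 1 selects the
# most active word after spreading activation from semantic features;
# step 2 jolts that word and selects the most active phoneme per slot.
# Noise plus shared features/phonemes produce semantic, formal, mixed,
# and nonword errors when the s and p weights are weakened.

WORDS = {
    "cat": {"onset": "k", "vowel": "ae", "coda": "t"},
    "dog": {"onset": "d", "vowel": "o",  "coda": "g"},
    "rat": {"onset": "r", "vowel": "ae", "coda": "t"},
    "mat": {"onset": "m", "vowel": "ae", "coda": "t"},
    "fog": {"onset": "f", "vowel": "o",  "coda": "g"},
}
FEATURES = {                      # invented features; overlap drives errors
    "cat": {"animate", "pet", "furry", "whiskers", "meows"},
    "dog": {"animate", "pet", "furry", "barks"},
    "rat": {"animate", "furry", "whiskers"},
    "mat": {"flat", "floor"},
    "fog": {"weather", "grey"},
}

def name_picture(target, s, p, steps=3, noise=0.05):
    """One naming attempt; returns (selected word, selected phonemes)."""
    feat_act = {f: 1.0 for f in FEATURES[target]}      # activate target features
    word_act = {w: 0.0 for w in WORDS}
    phon_act = {}                                      # (slot, phoneme) -> act

    def spread():
        # words receive semantic input, phoneme feedback, and noise
        for w in WORDS:
            sem_in = s * sum(feat_act.get(f, 0.0) for f in FEATURES[w])
            phon_in = p * sum(phon_act.get(sp, 0.0) for sp in WORDS[w].items())
            word_act[w] += sem_in + phon_in + random.gauss(0, noise)
        # phonemes receive input from every word that contains them
        for w, slots in WORDS.items():
            for sp in slots.items():
                phon_act[sp] = (phon_act.get(sp, 0.0)
                                + p * word_act[w] + random.gauss(0, noise))

    # Step 1, lemma access: spread, then select the most active word.
    for _ in range(steps):
        spread()
    word = max(word_act, key=word_act.get)

    # Step 2, phonological access: jolt the selected word, spread again,
    # then select the most active phoneme for each syllable slot.
    word_act[word] += 10.0
    for _ in range(steps):
        spread()
    phonemes = {slot: max((ph for (sl, ph) in phon_act if sl == slot),
                          key=lambda ph: phon_act[(slot, ph)])
                for slot in ("onset", "vowel", "coda")}
    return word, phonemes

random.seed(0)
print(name_picture("cat", s=0.10, p=0.10))  # intact weights: normally cat, k-ae-t
print(name_picture("cat", s=0.02, p=0.02))  # weakened weights: errors more likely
```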

  40. A Test of the Model: Picture-naming Errors in Aphasia (example target: “cat”) • 175 pictures of concrete nouns from the Philadelphia Naming Test • 94 patients (Broca, Wernicke, anomic, conduction) • 60 normal controls

  41. Response Categories for the target CAT: Correct (cat), Semantic (dog), Formal (mat), Mixed (rat), Unrelated (log), Nonword (dat). Continuity Thesis • Normal error pattern: 97% correct • Random error pattern: 80% nonwords.
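
A small sketch of this scoring scheme (the lexicon and the semantic- and form-relatedness criteria here are stand-ins for the actual scoring rules used with the Philadelphia Naming Test):

```python
# Score a response against the target CAT into the six categories above.

LEXICON = {"cat", "dog", "mat", "rat", "log"}
SEM_RELATED = {("cat", "dog"), ("cat", "rat")}      # assumed related pairs

def form_related(target, response):
    # crude stand-in: share the first letter or the rest of the word
    return response[0] == target[0] or response[1:] == target[1:]

def score(target, response):
    if response == target:
        return "correct"
    if response not in LEXICON:
        return "nonword"
    sem = (target, response) in SEM_RELATED or (response, target) in SEM_RELATED
    frm = form_related(target, response)
    if sem and frm:
        return "mixed"
    return "semantic" if sem else ("formal" if frm else "unrelated")

for r in ("cat", "dog", "mat", "rat", "log", "dat"):
    print(r, score("cat", r))
# cat correct, dog semantic, mat formal, rat mixed, log unrelated, dat nonword
```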

  42. Implementing the Continuity Thesis: 1. Set up the model lexicon so that when noise is very large, it creates an error pattern similar to the random pattern. 2. Set the processing parameters of the model so that its error pattern matches the normal controls.

  43. Lesioning the model: the semantic-phonological weight hypothesis. The network has a semantic-word weight (s) and a phonological-word weight (p); lesions are modeled as reductions in these weights (network diagram).

  44. Example patient fits (response proportions in the order Correct, Semantic, Formal, Mixed, Unrelated, Nonword):
     Patient LH, observed: .71 .03 .07 .01 .02 .15; model (s=.024, p=.018): .69 .06 .06 .01 .02 .17
     Patient IG, observed: .77 .10 .06 .03 .01 .03; model (s=.019, p=.032): .77 .09 .06 .01 .04 .03
     Patient GL, observed: .29 .04 .22 .03 .10 .32; model (s=.010, p=.016): .31 .10 .15 .01 .13 .30

  45. Representing Model-Patient Deviations: Root Mean Square Deviation (RMSD). LH: .016, IG: .016, GL: .043.
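
For reference, a sketch of the RMSD computation using patient LH's numbers from slide 44 (the result lands close to the .016 reported; any small difference reflects rounding in the proportions shown on the slide):

```python
# Root mean square deviation between observed and model-predicted
# response proportions for one patient.

def rmsd(observed, predicted):
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted))
            / len(observed)) ** 0.5

# Order: correct, semantic, formal, mixed, unrelated, nonword.
lh_observed  = [0.71, 0.03, 0.07, 0.01, 0.02, 0.15]
lh_predicted = [0.69, 0.06, 0.06, 0.01, 0.02, 0.17]   # model with s=.024, p=.018
print(round(rmsd(lh_observed, lh_predicted), 3))       # ≈ .017 with these rounded values
```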

  46. 94 new patients (no exclusions); 94.5% of variance accounted for.

  47. Conclusions • The logic underlying box-and-arrow models is perfectly compatible with connectionist models. • Connectionist principles augment the boxes and arrows with a mechanism for quantifying degree of damage, and with mechanisms for error types and hence an explanation of the error patterns. • Implications for recovery and rehabilitation.
