
Abstract Neuron


Presentation Transcript


  1. Abstract Neuron [Diagram: inputs i1, i2, ..., in with weights w1, w2, ..., wn, plus a constant bias input i0 = 1 with weight w0, feeding a unit whose output is y = 1 if net > 0, and 0 otherwise]
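As a concrete reading of this diagram, here is a minimal sketch of the abstract threshold neuron in Python; the weights and inputs in the example call are illustrative values, not taken from the slide.

def threshold_unit(inputs, weights, bias_weight):
    """Abstract neuron: output 1 if the net input exceeds 0, else 0."""
    # i0 = 1 is the constant bias input; bias_weight plays the role of w0.
    net = bias_weight + sum(w * i for w, i in zip(weights, inputs))
    return 1 if net > 0 else 0

# Illustrative call: two inputs with arbitrary weights
y = threshold_unit(inputs=[1, 0], weights=[0.7, 0.3], bias_weight=-0.5)
print(y)  # 0.7*1 + 0.3*0 - 0.5 = 0.2 > 0, so y = 1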

  2. Computing with Abstract Neurons • McCulloch-Pitts Neurons were initially used to model: • pattern classification • size = small AND shape = round AND color = green AND Location = on_tree => Unripe_fruit • linking classified patterns to behavior • size = large OR motion = approaching => move_away • size = small AND location = above => move_above • McCulloch-Pitts Neurons can compute logical functions: AND, NOT, OR
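To make the last bullet concrete, here is one possible set of weights under which binary threshold units compute AND, OR and NOT; these particular weight and bias values are one workable choice among many, not prescribed by the slides.

def fires(net):
    # Binary threshold activation: 1 if net > 0, else 0
    return 1 if net > 0 else 0

def AND(i1, i2):
    return fires(1.0 * i1 + 1.0 * i2 - 1.5)   # only 1 AND 1 clears the -1.5 bias

def OR(i1, i2):
    return fires(1.0 * i1 + 1.0 * i2 - 0.5)   # any single active input clears -0.5

def NOT(i1):
    return fires(-1.0 * i1 + 0.5)             # fires only when the input is off

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), NOT(a))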

  3. Computing logical functions: the OR function [Diagram: inputs i1 and i2 with weights w01 and w02, plus a bias input b = 1 with weight w0b, feeding an output unit y0 with threshold activation f] • Assume a binary threshold activation function. • What should you set w01, w02 and w0b to be so that you can get the right answers for y0?

  4. Many answers would work [Plot: the i1-i2 input space with the OR patterns and a separating decision line] • y = f(w01 i1 + w02 i2 + w0b b) • Recall the threshold function: the separation happens when w01 i1 + w02 i2 + w0b b = 0 • Move things around and you get i2 = -(w01/w02) i1 - (w0b b/w02)
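Checking one such answer: with the illustrative choice w01 = w02 = 1 and w0b = -0.5 (one of many settings that work, not the slide's prescribed values), the unit reproduces the OR truth table and the decision line comes out as i2 = -i1 + 0.5.

def f(net):
    return 1 if net > 0 else 0

w01, w02, w0b, b = 1.0, 1.0, -0.5, 1.0   # one workable choice for OR

for i1 in (0, 1):
    for i2 in (0, 1):
        y0 = f(w01 * i1 + w02 * i2 + w0b * b)
        print(f"i1={i1} i2={i2} -> y0={y0}")   # matches the OR truth table

# Decision line: i2 = -(w01/w02)*i1 - (w0b*b/w02), i.e. i2 = -i1 + 0.5
slope, intercept = -(w01 / w02), -(w0b * b / w02)
print(f"decision line: i2 = {slope}*i1 + {intercept}")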

  5. Decision Hyperplane • The two classes are therefore separated by the 'decision' line which is defined by putting the activation equal to the threshold. • It turns out that it is possible to generalise this result to Threshold Units with n inputs. • In 3-D the two classes are separated by a decision-plane. • In n-D this becomes a decision-hyperplane.

  6. Linearly Separable Patterns • PERCEPTRON is an architecture which can solve this type of decision boundary problem. • An "on" response in the output node represents one class, and an "off" response represents the other.
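For context, a short sketch of the classic perceptron learning rule on a linearly separable problem; the OR training patterns, learning rate, initial weights and epoch count below are illustrative assumptions rather than anything specified on the slide.

def predict(x, w, bias):
    # Threshold unit: "on" (1) for one class, "off" (0) for the other
    net = bias + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net > 0 else 0

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR patterns
w, bias, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                      # a few passes suffice for this problem
    for x, target in data:
        error = target - predict(x, w, bias)
        w = [wi + lr * error * xi for wi, xi in zip(w, x)]
        bias += lr * error

print(w, bias)
print([predict(x, w, bias) for x, _ in data])  # -> [0, 1, 1, 1]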

  7. The XOR function

  8. The Input Pattern Space

  9. The Decision planes

  10. Multiple Layers [Diagram: a two-layer threshold network over inputs I1 and I2, with input weights of 1, hidden-unit thresholds of 0.5 and 1.5, and an output unit y with threshold 0.5 receiving weights of 1 and -1 from the hidden units]

  11. Multiple Layers [The same network with the input pattern I1 = 0, I2 = 1 propagated through it]

  12. Multiple Layers [The same network with the input pattern I1 = 1, I2 = 1 propagated through it]
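Here is a sketch of the two-layer network these three slides step through, under one standard reading of the numbers on the diagram (hidden thresholds of 0.5 and 1.5, an output threshold of 0.5, and a -1 weight from the conjunction unit); together the units compute XOR of I1 and I2.

def step(net, threshold):
    return 1 if net > threshold else 0

def xor_net(i1, i2):
    h_or  = step(1 * i1 + 1 * i2, 0.5)      # fires if at least one input is on
    h_and = step(1 * i1 + 1 * i2, 1.5)      # fires only if both inputs are on
    return step(1 * h_or - 1 * h_and, 0.5)  # "OR but not AND" is XOR

for i1 in (0, 1):
    for i2 in (0, 1):
        print(i1, i2, xor_net(i1, i2))      # -> 0, 1, 1, 0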

  13. Computing other relations • The 2/3 node is a useful function: it activates its output if any 2 of its 3 inputs are active. • Such a node is also called a triangle node and will be useful for lots of representations.
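A minimal sketch of such a 2/3 node as a threshold unit; the equal weights of 1 and the threshold of 2 are one simple choice, not values given on the slide.

def triangle_node(a, b, c):
    """2/3 node: fires if at least two of its three binary inputs are active."""
    return 1 if (a + b + c) >= 2 else 0   # equal weights of 1, threshold of 2

print(triangle_node(1, 1, 0))  # -> 1: any two active inputs fire the node
print(triangle_node(1, 0, 0))  # -> 0: a single active input does not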

  14. Triangle nodes and McCulloch-Pitts Neurons? [Diagram: a triangle node binding Relation (A), Object (B) and Value (C)]

  15. “They all rose” • triangle nodes: when two of the abstract neurons fire, the third also fires • a model of spreading activation
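To see the spreading-activation idea in miniature, here is a toy update step over a fully connected triangle of units A, B and C (the labels follow slide 14); the set-based rule is an illustrative simplification, not the actual model.

def spread(active, nodes=("A", "B", "C")):
    """One update step over a fully connected triangle of units."""
    active = set(active) & set(nodes)
    return set(nodes) if len(active) >= 2 else active

print(spread({"A", "B"}))  # -> {'A', 'B', 'C'}: the third unit also fires
print(spread({"A"}))       # -> {'A'}: one active unit is not enough to spread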

  16. Basic Ideas behind the model • Parallel activation streams. • Top down and bottom up activation combine to determine the best matching structure. • Triangle nodes bind features of objects to values • Mutual inhibition and competition between structures • Mental connections are active neural connections

  17. 5 levels of Neural Theory of Language [Diagram: levels of abstraction, from Cognition and Language through Computation, Structured Connectionism and Computational Neurobiology down to Biology, with example topics at each level: psycholinguistic experiments, neural nets and learning, triangle nodes, neural development]

  18. Psychological Studies Eva Mok CS182/CogSci110/Ling109 Spring 2006

  19. Read the list ORANGE BROWN GREEN YELLOW BLUE RED

  20. Name the print color XXXXX XXXXX XXXXX XXXXX XXXXX XXXXX

  21. Name the print color RED GREEN BLUE BROWN ORANGE YELLOW

  22. Form and meaning interact in comprehension, production and learning The Stroop Test

  23. Top down and bottom up information • Bottom-up: stimulus driving processing • Top-down: knowledge and context driving processing • When is this information integrated? • Modular view: Staged serial processing • Interaction view: Information is used as soon as available

  24. Tanenhaus et al. (1979) [also Swinney, 1979] Word / non-word forced choice

  25. Modeling the task with triangle nodes

  26. Reaction times in milliseconds after “They all rose” [Table: probe reaction times at 0 delay and at a 200 ms delay; both probes (stood and flower) show facilitation at 0 delay, while at 200 ms only the contextually appropriate probe shows facilitation]

  27. When is context integrated? • Prime: spoken sentences ending in homophones They all rose vs. They bought a rose • Probe: stood and flower • No offset: primes both stood and flower • 200 ms offset: only primes appropriate sense • Modularity? Or weak contextual constraints?

  28. Allopenna, Magnuson & Tanenhaus (1998) [Photo: the eye-tracking setup (computer, eye camera, scene camera) as the participant hears “Pick up the beaker”] Adapted from Jim Magnuson, “Interaction in language processing: Pragmatic constraints on lexical access”

  29. Do rhymes compete? • Cohort (Marslen-Wilson): onset similarity is primary because of the incremental (serial) nature of speech • Cat activates cap, cast, cattle, camera, etc. • Rhymes won’t compete • NAM (Neighborhood Activation Model; Luce): global similarity is primary • Cat activates bat, rat, cot, cast, etc. • Rhymes are among the set of strong competitors • TRACE (McClelland & Elman): global similarity constrained by the incremental nature of speech • Cohorts and rhymes compete, but with different time courses Adapted from Jim Magnuson, “Interaction in language processing: Pragmatic constraints on lexical access”

  30. TRACE predicts different time course for cohorts and rhymes Adapted from Jim Magnuson, “Interaction in language processing: Pragmatic constraints on lexical access”

  31. TRACE predictions match eye-tracking data Adapted from Jim Magnuson, “Interaction in language processing: Pragmatic constraints on lexical access”

  32. Natural contexts are used continuously • Conclusion from this and other eye-tracking studies: • When constraints from natural contexts are extremely predictive, they are integrated as quickly as we can measure • Suggests rapid, continuous interaction among • Linguistic levels • Nonlinguistic context • Even for processes assumed to be low-level and automatic • Constrains processing theories, also has implications for, e.g., learnability Adapted from Jim Magnuson, “Interaction in language processing: Pragmatic constraints on lexical access”

  33. Eye movement paradigm • More sensitive than conventional paradigms • More naturalistic • Simultaneous measures of multiple items • Transparently linkable to computational model Adapted from Jim Magnuson, “Interaction in language processing: Pragmatic constraints on lexical access”

  34. Eye-tracker without headsets http://www.bcs.rochester.edu/infanteyetrack/eyetrack.html

  35. Recap: Goals of psycholinguistic studies • Direct goal: finding out what affects sentence processing • Indirect goal: getting at how words, syntax, concepts are represented in the brain • Modeling: testing out these hypotheses with computational models

  36. Areas studied in psycholinguistics • Lexical access / lexical structure • Syntactic structure • Referent selection • The role of working memory • Disfluencies

  37. Disfluencies and new information • Disfluencies: pause, repetition, restart • Often just seen as production / comprehension difficulties • Arnold, Fagnano, and Tanenhaus (2003) • How are disfluent references interpreted? • Components of referent selection • lexical meaning • discourse constraints

  38. Candle, camel, grapes, salt shaker • a. DISCOURSE-OLD CONTEXT: Put the grapes below the candle. DISCOURSE-NEW CONTEXT: Put the grapes below the camel. • b. FLUENT: Now put the candle below the salt shaker. DISFLUENT: Now put theee, uh, candle below the salt shaker.

  39. Predictions on 4 conditions (Target = candle) • Disfluent/New, Fluent/Given: Target • Put the grapes below the camel. Now put theee, uh, candle below the salt shaker. • Put the grapes below the candle. Now put the candle below the salt shaker. • Disfluent/Given, Fluent/New: Competitor • Put the grapes below the candle. Now put theee, uh, candle below the salt shaker. • Put the grapes below the camel. Now put the candle below the salt shaker.

  40. Disfluencies affect what we look at Percentage of fixations on all new objects from 200 to 500 ms after the onset of “the”/“theee uh” (i.e. before the onset of the head noun)

  41. Target is preferred in two conditions Percentage of target fixations minus percentage competitor fixations in each condition. Fixations cover 200–500 ms after the onset of the head noun.

  42. A lot of information is integrated in sentence processing! • Stroop test [i.e. color words]: form, meaning • Tanenhaus et al. (1979) [i.e. “they all rose”]: phonology, meaning, syntactic category • Allopenna et al. (1998) [i.e. cohorts & rhymes]: phonology, visual context • Arnold et al. (2003) [i.e. “theee, uh, candle”]: discourse information, visual context

  43. Producing words from pictures or from other words A comparison of aphasic lexical access from two different input modalities Gary Dell with Myrna Schwartz, Dan Foygel, Nadine Martin, Eleanor Saffran, Deborah Gagnon, Rick Hanley, Janice Kay, Susanne Gahl, Rachel Baron, Stefanie Abel, Walter Huber

  44. A 2-step Interactive Model of Lexical Access in Production [Network diagram: a layer of semantic features connected to word nodes FOG, DOG, CAT, RAT, MAT, which connect to phoneme nodes: onsets f, r, d, k, m; vowels ae, o; codas t, g] Adapted from Gary Dell, “Producing words from pictures or from other words”

  45. 1. Lemma Access: Activate semantic features of CAT [The same network with CAT's semantic features activated] Adapted from Gary Dell, “Producing words from pictures or from other words”

  46. 1. Lemma Access: Activation spreads through network [The same network with activation spreading from the semantic features to the word and phoneme nodes] Adapted from Gary Dell, “Producing words from pictures or from other words”

  47. 1. Lemma Access: Most active word from proper category is selected and linked to syntactic frame [The same network with CAT selected and linked to the N slot of an NP frame] Adapted from Gary Dell, “Producing words from pictures or from other words”

  48. 2. Phonological Access: Jolt of activation is sent to selected word [The same network with a jolt of activation delivered to the selected word node] Adapted from Gary Dell, “Producing words from pictures or from other words”

  49. 2. Phonological Access: Activation spreads through network [The same network with activation spreading from the selected word to the phoneme nodes] Adapted from Gary Dell, “Producing words from pictures or from other words”

  50. 2. Phonological Access: Most activated phonemes are selected [The same network with the most active onset, vowel and coda selected into a syllable frame (On, Vo, Co)] Adapted from Gary Dell, “Producing words from pictures or from other words”
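To make the two steps concrete, here is a toy spreading-activation sketch over the network shown in these slides. The word and phoneme inventory follows the diagram, but the semantic feature sets, the uniform connection weights, and the noise-free one-pass spreading are simplifying assumptions for illustration; the actual model is interactive (with feedback) and noisy, which is what produces semantic and phonological errors.

WORD_PHONEMES = {
    "FOG": ["f", "o", "g"], "DOG": ["d", "o", "g"],
    "CAT": ["k", "ae", "t"], "RAT": ["r", "ae", "t"], "MAT": ["m", "ae", "t"],
}
WORD_FEATURES = {   # illustrative, overlapping semantic features
    "CAT": {"animal", "pet", "furry"}, "DOG": {"animal", "pet", "barks"},
    "RAT": {"animal", "rodent"}, "MAT": {"floor", "flat"}, "FOG": {"weather"},
}
ONSETS, VOWELS, CODAS = {"f", "r", "d", "k", "m"}, {"ae", "o"}, {"t", "g"}

def spread(source_activation, links):
    """Pass activation from active source nodes along their links."""
    out = {}
    for src, act in source_activation.items():
        for tgt in links.get(src, []):
            out[tgt] = out.get(tgt, 0.0) + act
    return out

def lemma_access(target):
    # Step 1: activate the target's semantic features, let activation spread
    # to the word layer, and select the most active word node.
    features = {feat: 1.0 for feat in WORD_FEATURES[target]}
    feature_to_words = {}
    for word, feats in WORD_FEATURES.items():
        for feat in feats:
            feature_to_words.setdefault(feat, []).append(word)
    word_act = spread(features, feature_to_words)
    return max(word_act, key=word_act.get)

def phonological_access(word):
    # Step 2: send a jolt of activation to the selected word and pick the most
    # active phoneme in each slot (onset, vowel, coda). With no noise this
    # simply retrieves the word's own phonemes.
    act = spread({word: 1.0}, WORD_PHONEMES)
    pick = lambda slot: max(slot, key=lambda p: act.get(p, 0.0))
    return pick(ONSETS), pick(VOWELS), pick(CODAS)

selected = lemma_access("CAT")
print(selected, phonological_access(selected))   # CAT ('k', 'ae', 't')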
