
Cognitive Modeling


Presentation Transcript


  1. Cognitive Modeling Gary Cottrell Computer Science and Engineering Department Institute for Neural Computation Temporal Dynamics of Learning Center UCSD OSHER

  2. Cognitive Modeling…with a brief introduction to neural networks Gary Cottrell Computer Science and Engineering Department Institute for Neural Computation Temporal Dynamics of Learning Center UCSD OSHER

  3. Cognitive Modeling…with a brief introduction to neural networks…and the Interactive Activation Model Gary Cottrell Computer Science and Engineering Department Institute for Neural Computation Temporal Dynamics of Learning Center UCSD OSHER

  4. Outline I. Motivation: Why build cognitive models? II. Human-style computation: What's it like? III. Neural nets: What are they like? IV. The Interactive Activation Model V. Conclusions COGS 200 Cognitive Modeling

  5. Why model? • Models rush in where theories fear to tread. • Models can be manipulated in ways people cannot • Models can be analyzed in ways people cannot. COGS 200 Cognitive Modeling

  6. Models rush in where theories fear to tread • Theories are high level descriptions of the processes underlying behavior. • They are often not explicit about the processes involved. • They are difficult to reason about if no mechanisms are explicit -- they may be too high level to make explicit predictions. • Theory formation itself is difficult. COGS 200 Cognitive Modeling

  7. Models rush in where theories fear to tread • Using machine learning techniques, one can often build a working model of a task for which we have no theories or algorithms (e.g., expression recognition). • A working model provides an “intuition pump” for how things might work, especially if they are “neurally plausible” (e.g., development of face processing - Dailey and Cottrell). • A working model may make unexpected predictions (e.g., the Interactive Activation Model and SLNT). COGS 200 Cognitive Modeling

  8. Models can be manipulated in ways people cannot • We can see the effects of variations in cortical architecture (e.g., split (hemispheric) vs. non-split models (Shillcock and Monaghan word perception model)). • We can see the effects of variations in processing resources (e.g., variations in number of hidden units in Plaut et al. models). COGS 200 Cognitive Modeling

  9. Models can be manipulated in ways people cannot • We can see the effects of variations in environment (e.g., what if our parents were cans, cups or books instead of humans? I.e., is there something special about face expertise versus visual expertise in general? (Sugimoto and Cottrell, Joyce and Cottrell, Tong & Cottrell)). • We can see variations in behavior due to different kinds of brain damage within a single “brain” (e.g. Juola and Plunkett, Hinton and Shallice). COGS 200 Cognitive Modeling

  10. Models can be analyzed in ways people cannot. In the following, I specifically refer to neural network models. • We can do single unit recordings. • We can selectively ablate and restore parts of the network, even down to the single unit level, to assess the contribution to processing. • We can measure the individual connections -- e.g., the receptive and projective fields of a unit. • We can measure responses at different layers of processing (e.g., which level accounts for a particular judgment: perceptual, object, or categorization? Dailey et al., J. Cog. Neuro., 2002). COGS 200 Cognitive Modeling

  11. How (I like) to build Cognitive Models • Choose an area where there is a lot of data - even better if there is controversy! • I like to be able to relate the models to the brain, so “neurally plausible” models are preferred -- neural nets. • The model should be a working model of the actual task, rather than a cartoon version of it. • Of course, the model should nevertheless be simplifying (i.e., it should be constrained to the essential features of the problem at hand): • Do we really need to model the (supposed) translation invariance and size invariance of biological perception? • As far as I can tell, NO! • Then, take the model “as is” and fit the experimental data: 0 fitting parameters is preferred over 1, 2, or 3. COGS 200 Cognitive Modeling

  12. The other way (I like) to build Cognitive Models • Same as above, except: • Use them as exploratory models -- in domains where there is little direct data (e.g. no single cell recordings in infants or undergraduates) to suggest what we might find if we could get the data. These can then serve as “intuition pumps.” • Examples: • Why we might get specialized face processors (Dailey & Cottrell, 1999) • Why those face processors get recruited for other tasks (Sugimoto & Cottrell, 2001, and following papers) COGS 200 Cognitive Modeling

  13. So, Why model? • Models rush in where theories fear to tread. • Models can be manipulated in ways people cannot • Models can be analyzed in ways people cannot. COGS 200 Cognitive Modeling

  14. Outline I. Motivation: Why build cognitive models? II. Human-style computation: What's it like? III. Neural nets: What are they like? IV. The Interactive Activation Model V. Conclusions COGS 200 Cognitive Modeling

  15. Mutual Constraints • People are able to combine lots of different kinds of knowledge quickly in understanding English - • For example, in understanding the relationships given in a sentence: • Syntax (structure) gives us some information: The boy kissed the girl. COGS 200 Cognitive Modeling

  16. Mutual Constraints • But usually we need semantics (meaning) too: I saw the man on the hill with the big hat. I saw the man on the hill with my telescope. • These have identical parts: [Pronoun] [Verb] [Noun Phrase] [Prep. Phrase] [Prep. Phrase] • But how those parts go together depends on the meanings… I saw [the man [on the hill] [with the big hat]]. I saw [the man [on the hill]] with my telescope. COGS 200 Cognitive Modeling

  17. Mutual Constraints Ditto for pronoun reference: “The city council refused the demonstrators a permit because they were communists.” Who are they? COGS 200 Cognitive Modeling

  18. Mutual Constraints “The city council refused the demonstrators a permit because they were communists.” Who are they? In San Diego, it is the demonstrators who are likely to be the communists COGS 200 Cognitive Modeling

  19. Mutual Constraints “The city council refused the demonstrators a permit because they were communists.” Who are they? In Berkeley, it is the city council who are likely to be the communists! ;-) COGS 200 Cognitive Modeling

  20. Mutual Constraints: Word Sense Disambiguation The most frequent words (in any language) are the most ambiguous! How do we deal with this ambiguity? Answer: We combine constraints from many sources… COGS 200 Cognitive Modeling

  21. Mutual Constraints: Word Sense Disambiguation Things that people do well involve integrating constraints from multiple sources: Discourse Context: “I'm taking the car” Grammar: “The carpenter saw the wood” Meaning frequency: “Bob threw a ball” COGS 200 Cognitive Modeling

  22. Mutual Constraints: Word Sense Disambiguation Semantics (meaning): Associations between word senses: “dog's bark” “deep pit” The fit between roles and role fillers: “Bob threw the fight” COGS 200 Cognitive Modeling

  23. Mutual Constraints: Word Sense Disambiguation Pragmatic: "The man walked on the deck” "Nadia swung the hammer at the nail and the head flew off” (Hirst, 1983) COGS 200 Cognitive Modeling

  24. Computers are different… They are fine with sentences like: The boy the girl the dog bit liked cried. You don’t think that’s a sentence? COGS 200 Cognitive Modeling

  25. Computers are different… The boy cried. The boy the girl liked cried. The boy the girl the dog bit liked cried. COGS 200 Cognitive Modeling

  26. Computers are different… But… U CN REED THIIS CANDT YU? COGS 200 Cognitive Modeling

  27. Mutual Constraints: Audience Participation! Read this aloud with me: COGS 200 Cognitive Modeling

  28. Read these aloud with me: COGS 200 Cognitive Modeling

  29. Read these aloud with me: COGS 200 Cognitive Modeling

  30. But… COGS 200 Cognitive Modeling

  31. Who do you see? • Context influences perception COGS 200 Cognitive Modeling

  32. How do humans compute? • They are fast at using mutual constraints to: • Understand sentences • Disambiguate words • Read ambiguous letters • Recognize faces (and sometimes the constraints lead us astray!) COGS 200 Cognitive Modeling

  33. Outline I. Motivation: Why build cognitive models? II. Human-style computation: What's it like? III. Neural nets: What are they like? IV. The Interactive Activation Model V. Conclusions COGS 200 Cognitive Modeling

  34. Motivation • Why are people (still) smarter than machines? • Is it because we have faster hardware? • No • Is it because we have better programs? • No • It’s because we have brains! ;-) • Brains are massively parallel machines COGS 200 Cognitive Modeling

  35. Motivation • Why are people (still) smarter than machines? • Basic differences in architecture • Hence some of us study brain-like computational models: Neural nets • “Cartoon versions” of how the brain works COGS 200 Cognitive Modeling

  36. How do neural networks compute? • The 100-step program constraint (Feldman, 1985): • Neurons (brain cells) are slow: they take about 10 ms to respond • Yet mental events take about half a second (500 ms) • So there are no more than 50 steps possible to arrive at an answer! • This suggests: • Massive parallelism • Simple steps COGS 200 Cognitive Modeling
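
  The arithmetic behind the constraint, using the slide's own figures:

  $$\text{serial steps} \;\le\; \frac{500~\text{ms per mental event}}{10~\text{ms per neuronal response}} \;=\; 50$$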

  37. Constraints on a brain model (Francis Crick) COGS 200 Cognitive Modeling

  38. A real neuron (Ramón y Cajal, 1899) • Axon (output) • Cell body (soma) • Dendrites (input) COGS 200 Cognitive Modeling

  39. A model “neuron” • Axon (output) • Cell body (soma) • Dendrites (input) COGS 200 Cognitive Modeling

  40. A model “neuron” • Inputs (from another unit or the outside world) • Connection strengths (or weights) • Internal “potential” • Output, representing firing frequency COGS 200 Cognitive Modeling
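
  To make the model “neuron” concrete, here is a minimal sketch in Python. The logistic squashing function and the particular weights are illustrative assumptions, not something specified on the slide:

```python
import math

def unit_output(inputs, weights, bias=0.0):
    """A cartoon 'neuron': the weighted sum of the inputs gives an internal
    'potential', which is squashed into a 0-1 'firing frequency'."""
    potential = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-potential))

# Example: one excitatory (+0.8) and one inhibitory (-0.3) connection.
print(unit_output([1.0, 0.5], [0.8, -0.3]))   # ~0.66
```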

  41. Neural nets (PDP nets, connectionist networks) • Networks of simple units • Connected by weighted links • Compute by spreading activation and inhibition COGS 200 Cognitive Modeling

  42. The Interactive Activation Model: A model of reading from print • Assumptions: • Levels of abstraction • Spatially parallel • Temporally parallel (all levels at the same time) • Interactive • A neural net COGS 200 Cognitive Modeling

  43. The Interactive Activation Model: A model of reading from print • Assumptions: • Levels of abstraction • Spatially parallel • Temporally parallel (all levels at the same time) • Interactive • A neural net COGS 200 Cognitive Modeling

  44. The Interactive Activation Model: A model of reading from print • Word level • Letter level • Feature level COGS 200 Cognitive Modeling
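
  A rough sketch of the kind of activation update the Interactive Activation Model uses, following the form of the McClelland & Rumelhart (1981) rule; the parameter values below are placeholders for illustration, not the published ones:

```python
def ia_update(a, net, rest=-0.1, decay=0.1, a_min=-0.2, a_max=1.0):
    """One interactive-activation step for a single unit.
    a   : current activation of the unit
    net : summed excitation (+) and inhibition (-) from connected units
    The net input is scaled by the distance to the ceiling or floor, and
    the activation decays back toward its resting level."""
    if net > 0:
        effect = net * (a_max - a)     # excitation pushes toward the maximum
    else:
        effect = net * (a - a_min)     # inhibition pushes toward the minimum
    return a + effect - decay * (a - rest)

# Example: a word unit receiving steady bottom-up support from its letters.
a = -0.1                               # start at the resting level
for _ in range(10):
    a = ia_update(a, net=0.2)
print(round(a, 3))                     # rises well above rest (toward ~0.63)
```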

  45. The Interactive Activation Model: Data it was built to explain The word-superiority effect (Cattell, 1886; Reicher, 1969):

                 Word stimulus    Non-word stimulus    Letter alone
   70 ms         WALK             XDYK                 K
   Mask          ####             ####                 #
   2AFC          L or K?          L or K?              L or K?

  Word stimulus results in best accuracy. Note that the result can't be due to post-perceptual guessing, since both letters fit the word (WALL and WALK are both words). COGS 200 Cognitive Modeling
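
  To connect the model to a forced-choice task like this, its letter-level activations have to be read out as a choice between the two alternatives. Here is a hedged sketch of one such readout, a Luce-style choice rule over the activations at the cued position; the activation values and the scaling constant are made up for illustration:

```python
import math

def two_afc_probability(act_target, act_foil, scale=10.0):
    """Probability of choosing the target letter over the foil, given their
    activations at the cued position (Luce choice over response strengths)."""
    s_target = math.exp(scale * act_target)
    s_foil = math.exp(scale * act_foil)
    return s_target / (s_target + s_foil)

# A letter embedded in a word tends to reach a higher activation than the
# same letter presented alone, so the forced choice is more accurate.
print(two_afc_probability(0.45, 0.05))   # letter in word:  ~0.98
print(two_afc_probability(0.25, 0.05))   # letter alone:    ~0.88
```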

  46. The Interactive Activation Model: Data it was built to explain COGS 200 Cognitive Modeling

  47. Operation of the model COGS 200 Cognitive Modeling

  48. Operation of the model COGS 200 Cognitive Modeling

  49. Example of data accounted for… • This is from an experiment where subjects either saw a word or a letter in a mask: e.g., ##B# vs. DEBT. Contrast and mask vs. no mask were varied. • Main point: the difference in the size of the effect in the two conditions: it is much bigger for the bright display & mask condition. Model: 81 66 78 68 COGS 200 Cognitive Modeling

  50. Example of data accounted for… • Results for bright target, bright mask COGS 200 Cognitive Modeling
