
Chapter Seven


Presentation Transcript


  1. Chapter Seven The Network Approach: Mind as a Web

  2. Connectionism • The major field of the network approach. • Connectionists construct Artificial Neural Networks (ANNs), which are computer simulations of how groups of neurons might perform some task.

  3. Information processing • ANNs utilize a processing strategy in which large numbers of computing units perform their calculations simultaneously. This is known as parallel distributed processing. • In contrast, traditional computers are serial processors, performing one computation at a time.

  4. Serial and parallel processing architectures

  5. Serial vs. Parallel Computing • Serial and parallel machines do not differ in what they can compute, only in speed: a serial computer can simulate any parallel one. • Parallel computing is routinely simulated on general purpose computers. • Modern general purpose computers are not strictly serial in any case.
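
The point that a serial machine can simulate parallel processing can be made concrete with a short sketch. The snippet below is a minimal illustration assuming NumPy, with made-up weights: every unit's new activation is computed from the same snapshot of the old activations, so the result is as if all units had fired at once.

import numpy as np

# A sketch of parallel distributed processing simulated on a serial
# machine: every unit's new activation is computed from the same snapshot
# of the old activations (a synchronous update), so the outcome is the
# same as if all units had computed simultaneously. Weights are made up.
weights = np.array([[0.0, 0.5, -0.3],
                    [0.5, 0.0, 0.8],
                    [-0.3, 0.8, 0.0]])
activations = np.array([1.0, 0.0, 1.0])

# One "parallel" step for all three units at once.
activations = np.tanh(weights @ activations)
print(activations)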

  6. Approaches • The traditional approach to problem solving in cognition and AI is to use an algorithm in which every processing step is planned in advance. It relies on symbols and on operators applied to those symbols. This is the knowledge-based approach. • Connectionists instead let the ANN perform the computation on its own, with much less explicit planning, and concern themselves with the behavior of the network. This is the behavior-based approach. • In practice, neither label fits perfectly; the two approaches overlap.

  7. Knowledge representation • Information in an ANN is spread across a collection of nodes and the connections between them. This is a distributed representation. • Information in a semantic network, however, can be stored in a single node. This is a form of local representation.

  8. Characteristics of ANNs • A node is a basic computing unit. • A link is the connection between one node and the next. • Weights specify the strength of connections. • A node fires if it receives activation above threshold.

  9. Characteristics of ANNs • A basis function determines the amount of stimulation a node receives. • An activation function maps the strength of the inputs onto the node’s output. [Figure: a sigmoidal activation function]
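
The node described on slides 8 and 9 can be sketched in a few lines of Python. This is an illustrative sketch, not any particular simulator; the function name, weights, and threshold values are invented for the example.

import math

def node_output(inputs, weights, threshold=0.0):
    """One ANN node, as a sketch: a weighted-sum basis function
    followed by a sigmoidal activation function."""
    # Basis function: total stimulation reaching the node.
    net = sum(x * w for x, w in zip(inputs, weights))
    # Sigmoidal activation function: squashes the net input into (0, 1);
    # the threshold shifts the midpoint of the curve.
    return 1.0 / (1.0 + math.exp(-(net - threshold)))

print(node_output([1.0, 0.5], [0.8, -0.2], threshold=0.1))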

  10. Early neural networks • Hebb (1949) describes two types of cell groupings. • A cell assembly is a small group of neurons that repeatedly stimulate one another. • A phase sequence is a set of cell assemblies that activate each other. • Hebb Rule: when one cell repeatedly activates another, the strength of the connection between them increases.
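
The Hebb rule reduces to one line of arithmetic. Below is a minimal sketch; the learning rate and starting weight are arbitrary illustrative values.

# A sketch of the Hebb rule: cells that fire together wire together.
def hebb_update(weight, pre, post, lr=0.1):
    # delta_w = lr * pre * post: co-activation strengthens the link.
    return weight + lr * pre * post

weight = 0.2
for _ in range(5):            # the sending cell repeatedly activates the receiver
    weight = hebb_update(weight, pre=1.0, post=1.0)
print(weight)                 # about 0.7: the connection has grown stronger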

  11. Early neural networks • Perceptrons were simple networks that could detect and recognize visual patterns. • Early perceptrons had only two layers, an input and an output layer.
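
A hedged sketch of such a two-layer perceptron follows: input units feed one output unit, and the perceptron learning rule adjusts the weights only when the response is wrong. Logical OR stands in here for a simple visual pattern, and the constants are arbitrary choices.

import random

patterns = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = [random.uniform(-0.5, 0.5) for _ in range(2)]
bias, lr = 0.0, 0.1

def output(x):
    # The output unit fires (1) when its summed input crosses zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

for _ in range(25):                        # a few passes over the patterns
    for x, target in patterns:
        err = target - output(x)           # 0 when the response is correct
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        bias += lr * err

print([(x, output(x)) for x, _ in patterns])   # responses now match the targets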

  12. Modern ANNs • More recent ANNs contain three layers, an input, hidden, and output layer. • Input units activate hidden units, which then activate the output units.

  13. Backpropagation learning in ANNs • An ANN can learn to make a correct response to a particular stimulus input. • The initial response is compared to a desired response supplied by a teacher. • The difference between the two, an error signal, is sent back through the network. • This changes the weights so that the actual response moves closer to the desired one.
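
The backpropagation cycle can be sketched with a toy three-layer network trained on XOR, assuming NumPy. The layer sizes, learning rate, and epoch count are arbitrary, and an unlucky random start may need more epochs; this illustrates the error signal flowing backward, not a specific textbook implementation.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # the teacher's desired responses

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # 2 input -> 4 hidden units
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # 4 hidden -> 1 output unit

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    H = sigmoid(X @ W1 + b1)      # input units activate hidden units
    Y = sigmoid(H @ W2 + b2)      # hidden units activate the output unit
    err = T - Y                   # error signal: desired minus actual response
    dY = err * Y * (1 - Y)        # send the error back through the output layer
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += lr * (H.T @ dY); b2 += lr * dY.sum(axis=0)
    W1 += lr * (X.T @ dH); b1 += lr * dH.sum(axis=0)

print(np.round(Y, 2))             # actual responses approach the targets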

  14. Criteria of different ANNs • Supervised networks have a teacher. Unsupervised networks do not. • Networks can be either single-layer or multilayer. • Information in a network can flow forward only, a feed-forward network, or it can flow back and forth between layers, a recurrent network.

  15. Network typologies • Hopfield-Tank networks. Supervised, single-layer, and laterally connected. Good at recovering “clean” versions of noisy patterns. • Kohonen networks. An example of a two-layer, unsupervised network. Able to create topological maps of features present in the input. • Adaptive Resonance Theory (ART) networks. Unsupervised, multilayer, recurrent networks that classify input patterns.
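
The Hopfield-style pattern cleanup can be illustrated in a few lines, again assuming NumPy. The stored pattern, the Hebbian outer-product storage rule used here, and the number of update steps are all illustrative choices.

import numpy as np

# Store one pattern, then recover it from a corrupted copy.
stored = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = np.outer(stored, stored).astype(float)
np.fill_diagonal(W, 0)                 # no self-connections

noisy = stored.copy()
noisy[0], noisy[3] = -noisy[0], -noisy[3]   # flip two elements as "noise"

state = noisy.astype(float)
for _ in range(5):                     # update until the state settles
    state = np.sign(W @ state)

print(np.array_equal(state, stored))   # True: the noise has been cleaned up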

  16. Evaluating connectionism • Advantages: biological plausibility, graceful degradation, interference, generalization. • Disadvantages: no massive parallelism, convergent dynamics, the stability-plasticity dilemma, catastrophic interference.

  17. Semantic networks • Share some features with ANNs. • Individual nodes represent meaningful concepts. • Used to explain the organization and retrieval of information from LTM.

  18. Characteristics of semantic networks • Spreading activation. Activity spreads outward from nodes along links and activates other nodes. • Retrieval cues. Nodes associated with others can activate them indirectly. • Priming. Residual activation can facilitate responding.
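
A minimal sketch of spreading activation over a toy semantic network: activity fans out from a source node along links, decaying with distance, and the residual activation left on related nodes models priming. The network, decay factor, and depth limit below are invented for illustration.

links = {
    "canary": ["bird", "yellow"],
    "bird": ["animal", "wings", "canary"],
    "animal": ["bird"],
}

def spread(start, decay=0.5, depth=2):
    activation = {start: 1.0}
    frontier = [start]
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            for neighbor in links.get(node, []):
                gain = activation[node] * decay   # activation decays per link
                if gain > activation.get(neighbor, 0.0):
                    activation[neighbor] = gain
                    next_frontier.append(neighbor)
        frontier = next_frontier
    return activation

print(spread("canary"))   # "bird" keeps more residual activation than "animal"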

  19. A hierarchical semantic network • Sentence verification tasks suggest a hierarchical organization of concepts in semantic memory (Collins and Quillian, 1969). • Meaning for concepts such as animals may be arranged into superordinate, ordinate, and subordinate categories. • Vertical distance in the network corresponds to category membership. • Horizontal distance corresponds to property information.
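
The Collins and Quillian idea can be sketched as a small isa hierarchy in which properties are stored at the highest level where they apply. The concepts and properties below are the usual textbook examples, and the step count stands in for predicted verification time.

isa = {"canary": "bird", "bird": "animal"}
properties = {
    "canary": {"is yellow"},
    "bird": {"can fly", "has wings"},
    "animal": {"eats", "breathes"},
}

def verify(concept, prop):
    steps = 0
    while concept is not None:
        if prop in properties.get(concept, set()):
            return True, steps          # more steps -> slower verification
        concept = isa.get(concept)      # climb one superordinate level
        steps += 1
    return False, steps

print(verify("canary", "is yellow"))    # (True, 0): stored at the node itself
print(verify("canary", "can fly"))      # (True, 1): one level up
print(verify("canary", "breathes"))     # (True, 2): two levels up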

  20. Example of a Hierarchical Semantic Network. From S. C. Shapiro, Knowledge Representation. In L. Nadel, Ed., Encyclopedia of Cognitive Science, Macmillan, 2003.

  21. Propositional networks • Can represent propositional or sentence-like information. Example: “The man threw the ball.” • Allow for more complex relationships between concepts such as agents, objects, and relations. • Can also code for episodic knowledge.
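
A sketch of one proposition, “The man threw the ball,” as an agent-relation-object record. The field names are invented for the example and are not SNePS syntax.

proposition = {
    "agent": "man",
    "relation": "threw",
    "object": "ball",
    "time": "past",       # episodic detail can be coded alongside the proposition
}

def describe(p):
    return f"The {p['agent']} {p['relation']} the {p['object']}."

print(describe(proposition))   # The man threw the ball.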

  22. Example of a Propositional Semantic Network. From S. C. Shapiro, Knowledge Representation. In L. Nadel, Ed., Encyclopedia of Cognitive Science, Macmillan, 2003.

  23. Episodic Memory in Cassie, a SNePS-Based Agent • *NOW contains the SNePS term representing the current time. • *NOW moves when Cassie acts or perceives a change of state.

  24. Representation of Time • [Figure: a SNePS network in which an event node links an agent (B1), an action, and an object (B6) to a time term connected to *NOW by before/after arcs.]

  25. Movement of Time • [Figure: time terms t1, t2, t3 linked by before/after arcs; *NOW moves forward from t1 to t2 to t3.]

  26. Performing a Punctual Act • [Figure: a punctual act's event is attached to a single time term between t1 and t3 by before/after arcs as *NOW advances.]

  27. Performing a Durative Act • [Figure: a durative act's event time is related to t2 by superinterval (supint) and subinterval (subint) arcs, with before/after arcs to t1 and t3 as *NOW advances.]
