This presentation explores the successes, limitations, and future directions of neural network models of cognition in psychology. It discusses pattern recognition, intuitive physics, intuitive psychology, learning as model building, and model-based and model-free methods. It also covers the Frostbite and Omniglot challenges for neural networks and the distinction between emergent intelligence and built-in intelligence.
Successes, Limitations, and Future Directions for Neural Network Models of Cognition
Psychology 209 – Winter 2019
March 7, 2019
Lake et al.
• Pattern recognition vs. model building: Cognition is about using models to understand the world, to explain what we see, to imagine what could have happened that didn’t, or what could be true that isn’t, and then planning actions to make it so.
Start-up software
• Intuitive physics: Infants have primitive object concepts that allow them to track objects over time and to discount physically implausible trajectories – e.g., they know that objects persist over time and that they are solid and coherent.
• Intuitive psychology: Infants understand that other people have mental states such as goals and beliefs, and this understanding strongly constrains their learning and predictions.
Learning as model building
• Explaining observed data through the construction of causal models of the world.
• ‘Early-present capacities for intuitive physics and psychology are also causal models of the world.’
• A primary job of learning is to extend and enrich these models and build analogous causally-structured theories of other domains.
• Human learning is richer and more efficient than state-of-the-art algorithms in machine learning.
• Compositionality and learning to learn are ingredients that make this type of rapid model learning possible (a toy illustration follows).
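As a purely illustrative sketch (not the generative model in Lake et al.), the Python snippet below shows the compositional idea: a small shared library of stroke primitives yields a combinatorial space of new ‘characters’, and new exemplars of a concept reuse its parts with small variations. The primitive names and the jitter parameter are invented for the example.

import random

stroke_primitives = ["vertical", "horizontal", "left-arc", "right-arc", "dot"]

def sample_character(n_strokes=3):
    # A new concept is a structured composition of reusable parts.
    return [random.choice(stroke_primitives) for _ in range(n_strokes)]

def sample_exemplar(character):
    # New examples of a known concept reuse its parts, with small variation
    # in how each stroke is produced (here, a jitter value per stroke).
    return [(stroke, round(random.gauss(0, 0.1), 2)) for stroke in character]

new_concept = sample_character()
print(new_concept)                   # e.g. ['left-arc', 'vertical', 'dot']
print(sample_exemplar(new_concept))  # same strokes, slightly perturbed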
Model-based and model-free methods
• Using a model is cumbersome and slow; model-free reinforcement learning can allow real-time ‘control’.
• Humans combine model-based (MB) and model-free (MF) methods both competitively and cooperatively (a minimal contrast between the two is sketched below).
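The following Python sketch contrasts the two approaches under simple assumptions (a small tabular environment; the sizes, learning rate, and discount factor are illustrative, not taken from the slides): the model-free update is a single cheap Q-learning step, while the model-based computation re-plans over a learned transition and reward model, which is more flexible but more cumbersome.

import numpy as np

n_states, n_actions, gamma, alpha = 5, 2, 0.95, 0.1
Q = np.zeros((n_states, n_actions))            # model-free action values
T = np.zeros((n_states, n_actions, n_states))  # learned transition counts
R = np.zeros((n_states, n_actions))            # learned average rewards

def model_free_update(s, a, r, s_next):
    # One cheap Q-learning step: fast at decision time, slow to learn.
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

def model_based_values(n_sweeps=50):
    # Value iteration over the learned model: flexible but cumbersome,
    # since every re-plan sweeps the whole state space.
    P = T / np.maximum(T.sum(axis=2, keepdims=True), 1)  # normalized transitions
    V = np.zeros(n_states)
    for _ in range(n_sweeps):
        V = (R + gamma * P @ V).max(axis=1)
    return V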
Two challenges
• Characters
• Frostbite
One example: Omniglot
• Classification of new examples (a generic one-shot baseline is sketched below)
• Generation of new examples
• Parsing an object into its parts
• Generation of new concepts from related examples
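To make ‘classification of new examples’ concrete, here is a generic one-shot classification baseline, not the Bayesian program learning approach associated with Omniglot: given one labeled image per novel character class, a query is assigned the label of its nearest neighbor in some feature space. The embed function is a placeholder assumption.

import numpy as np

def embed(image):
    # Placeholder featurization: flatten the image into a vector.
    return np.asarray(image, dtype=float).ravel()

def one_shot_classify(query_image, support_set):
    # support_set: list of (label, image) pairs, one per novel class.
    q = embed(query_image)
    distances = [(np.linalg.norm(q - embed(img)), label) for label, img in support_set]
    return min(distances, key=lambda d: d[0])[1]  # label of nearest support example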
DQN learns Frostbite slowly – people can do well from brief instruction or from watching a good player (a sketch of the DQN update follows the list). The player must:
• Construct an igloo
• Jump on white ice floes
• Gather fish
• Don’t fall in the water
• Avoid geese & polar bears
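For reference, the core of the DQN update that drives this slow, experience-hungry learning can be sketched as follows; a linear Q-function stands in for the convolutional network, and all sizes and hyperparameters are illustrative assumptions rather than the published settings.

import random
import numpy as np

state_dim, n_actions, gamma, lr = 16, 4, 0.99, 1e-3
W = np.zeros((n_actions, state_dim))   # online Q-network (linear stand-in)
W_target = W.copy()                    # slowly updated target network
replay = []                            # buffer of (s, a, r, s_next, done) tuples

def dqn_step(batch_size=32):
    # Regress Q(s, a) toward the bootstrapped target
    # y = r + gamma * max_a' Q_target(s', a') on a random replay batch.
    for s, a, r, s_next, done in random.sample(replay, min(batch_size, len(replay))):
        target = r if done else r + gamma * (W_target @ s_next).max()
        td_error = target - W[a] @ s
        W[a] += lr * td_error * s      # semi-gradient update for the chosen action

def sync_target():
    # Periodically copy the online weights into the target network.
    global W_target
    W_target = W.copy()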
Emergent intelligence vs. built-in intelligence
• Built-in
  • May be easier to create
  • Since it is designed, it is likely to be easier to understand
  • You need to build in just the right structure for what you want to learn to fit within it
  • May not deal with quasiregularity
• Emergent
  • Not as easy to create
  • Not as easy to understand
  • Deals with quasiregularity
  • Involves less prior commitment to structure