

  1. Hebbian Constraint on the Resolution of the Homunculus Fallacy Leads to a Network that Searches for Hidden Cause-Effect Relationships Andras Lorincz andras.lorincz@elte.hu http://nipg.inf.elte.hu

  2. Content • Homunculus fallacy and resolution • Hebbian architecture step-by-step • Outlook to neurobiology • Cognitive Map: the hippocampal formation (in rats) • Extensions to control and reinforcement learning • Conjecture about consciousness • Conclusions

  3. The Homunculus Fallacy How do we know that this is a phone?

  4. Democritus’ Answer Small phone ‘atoms’ fly off and leave a ‘print’ – a representation of the phone – on our eyes, and this is how we know

  5. Fallacy Someone ‘should make sense’ of the print made by the phone atoms on the eye: Who is that reader? What kind of representation is he using? Who makes sense of that representation? Infinite regression

  6. Root of the fallacy is in the wording • We transform the infinite regression • into a finite architecture • with convergent dynamics

  7. Root of the fallacy is in the wording • We transform the infinite regression • into a finite architecture • with convergent dynamics (Not the representation but the) input makes sense, provided that the representation can reconstruct the input (given the experiences) • In other words: the representation can produce an output that is similar to the input of the network

  8. Architecture with Hebbian learning • x: input • h: hidden representation • y: reconstructed input (should match x) • W: bottom-up matrix, or BU transformation • M: top-down matrix, or TD transformation Hebbian (or local) learning: the components of the matrices (transformations) form the long-term memory (LTM) of the system. Locality of learning warrants graceful degradation of the architecture

  9. Architecture with Hebbian learning x: input h: hidden representation y: reconstructed input (should match x) W: bottom-up matrix, or BU transformation M: top-down matrix, or TD transformation
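The data flow of the two slides above can be written out in a few lines. The sketch below is only an illustration of x → h → y; the dimensions, the random initialization, and the use of NumPy are assumptions, not taken from the talk.

```python
# Minimal sketch of the reconstruction architecture (sizes and
# initialization are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h = 16, 8                            # input / hidden dimensions (assumed)
W = 0.1 * rng.standard_normal((n_h, n_x))   # bottom-up (BU) transformation
M = 0.1 * rng.standard_normal((n_x, n_h))   # top-down (TD) transformation

x = rng.standard_normal(n_x)                # input
h = W @ x                                   # hidden representation
y = M @ h                                   # reconstructed input, should match x
```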

  10. Previous. New: we compare ε, the reconstruction error: x – y

  11. New: we compare ε, the reconstruction error: x – y. Previous

  12. New: we compare ε, the reconstruction error: x – y. Previous. IT IS NOT Hebbian. Slow. We have to compensate

  13. New: we compare ε, the reconstruction error: x – y. Previous. Hebbian. Slow. We have to compensate
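Slides 12–13 distinguish a non-Hebbian and a Hebbian way of using the reconstruction error; the exact rules live in the slide figures, which the transcript does not contain. One Hebbian (local) possibility, continuing the sketch above and given purely as an illustration, is an outer-product update of M driven by the error:

```python
# One possible local (Hebbian) update: each element of M changes using only
# the signals available at its two ends (an error component and a hidden
# activity). Illustrative assumption; the rule on the slide may differ.
def hebbian_update(M, h, eps, eta=0.01):
    return M + eta * np.outer(eps, h)        # Delta M proportional to eps h^T

eps = x - y                                  # reconstruction error (slide 10)
M = hebbian_update(M, h, eps)                # slow: many small corrections
```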

  14. New: we learn to predict ε(t+1), the innovation: ε(t+1) = x(t+1) – y(t+1) = x(t+1) – Mh(t). Previous. Hebbian. Fast. Works for changing inputs. The hidden model can work in the absence of input
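The predictive variant of this slide compares the next input with the reconstruction produced from the current hidden state, so the error becomes the innovation ε(t+1) = x(t+1) – Mh(t). Continuing the assumed sketch:

```python
# Innovation instead of reconstruction error (slide 14):
#   eps(t+1) = x(t+1) - y(t+1) = x(t+1) - M h(t)
def innovation(M, h_t, x_next):
    return x_next - M @ h_t

x_next = rng.standard_normal(n_x)            # stand-in for the next input
eps_next = innovation(M, h, x_next)
M = hebbian_update(M, h, eps_next)           # same local rule, now predictive
```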

  15. Conditions

  16. Conditions AutoRegressive (AR) process with recurrent network F: h(t+1) = Fh(t) + ε(t+1) • h: hidden state • F: hidden deterministic dynamics • ε: hidden innovation “causing” the process • M: subtracts the predictive part, computes the innovation • F: adds the predictive part, makes the hidden model • Learning of F: two-phase operation, supervised learning • Phase I: x(t+1) • Phase II: ε(t+1)
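A compact way to picture the hidden AR model and the learning of F is sketched below. Only the equation h(t+1) = Fh(t) + ε(t+1) comes from the slide; the least-squares fit stands in for the supervised learning that is mentioned, and the two-phase scheduling (Phase I: x(t+1), Phase II: ε(t+1)) is not modelled.

```python
# Hidden AR(1) model: h(t+1) = F h(t) + eps(t+1). Here eps is the hidden-space
# innovation (dimension n_h). M subtracts the predictive part (computes the
# innovation); F adds it back (plays the deterministic hidden dynamics).
F = np.zeros((n_h, n_h))                     # hidden deterministic dynamics

def ar_step(F, h_t, eps_next):
    """Advance the hidden state with dynamics F and innovation eps(t+1)."""
    return F @ h_t + eps_next

def fit_F(H_prev, H_next, lam=1e-3):
    """Assumed supervised scheme: least-squares estimate of F from stored
    consecutive hidden states, H_prev and H_next of shape (T, n_h)."""
    A = H_prev.T @ H_prev + lam * np.eye(H_prev.shape[1])
    return np.linalg.solve(A, H_prev.T @ H_next).T
```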

  17. Cause-effect relations • Cause: innovation of the autoregressive process • Effect: deterministic dynamics ‘played’ by matrix F

  18. Cause-effect relations • Cause: innovation of the autoregressive process • Effect: deterministic dynamics ‘played’ by matrix F • One can search for hidden and independent causes The architecture becomes more sophisticated: • independent component analysis • representation of independent causes • representation of hidden state variables

  19. Generalization: AutoRegressive Independent Process Analysis (AR-IPA) Double loop: both the state and the innovation are represented
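Operationally, AR-IPA can be read as: estimate the innovations of the hidden AR process, then look for independent components in them. The sketch below uses FastICA from scikit-learn purely as a convenient, assumed stand-in for the ICA step; the talk does not prescribe this implementation.

```python
# AR-IPA sketch: independent causes are sought in the innovations of the
# hidden AR process. FastICA is an assumed, off-the-shelf ICA choice.
from sklearn.decomposition import FastICA

def innovations(F, H):
    """eps(t+1) = h(t+1) - F h(t) for a hidden-state sequence H of shape (T, n_h)."""
    return H[1:] - H[:-1] @ F.T

def ar_ipa_sources(F, H, n_sources):
    eps_seq = innovations(F, H)
    ica = FastICA(n_components=n_sources, random_state=0)
    return ica.fit_transform(eps_seq)        # estimated independent causes
```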

  20. “Cognitive Map” (in rats) Hippocampal formation Lorincz-Szirtes Autoregressive model of the hippocampal representation of events IJCNN Atlanta, June 14-19, 2009

  21. “Cognitive Map” (in rats) Hippocampal formation Lorincz-Szirtes Autoregressive model of the hippocampal representation of events IJCNN Atlanta, June 14-19, 2009 Similar anatomical structure Similar operational properties Two-phase operation

  22. “Cognitive Map” (in rats) Hippocampal formation Lorincz-Szirtes Autoregressive model of the hippocampal representation of events IJCNN Atlanta, June 14-19, 2009 A single additional piece, CA3–DG, eliminates echoes (ARMA-IPA)

  23. “Cognitive Map” (in rats) Hippocampal formation Lorincz-Szirtes Autoregressive model of the hippocampal representation of events IJCNN Atlanta, June 14-19, 2009 • Learns places and directions • path integration / planning (dead reckoning)

  24. Extensions of the network • AR can be embedded into reinforcement learning Kalman filter and RL: Szita, Lorincz, Neural Computation, 2004 Echo State Networks and RL: Szita, Gyenes, Lorincz, ICANN, 2006 • AR can be extended with control (ARX) and active (Bayesian) learning Poczos, Lorincz, Journal of Machine Learning Research, 2009
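The ARX extension mentioned above adds a control term to the hidden dynamics, h(t+1) = Fh(t) + Gu(t) + ε(t+1). A minimal sketch continuing the running example; the control matrix G and its dimension are assumptions, and the cited RL embeddings are not modelled here.

```python
# ARX: the hidden AR model extended with a control input u(t).
n_u = 4                                      # control dimension (assumed)
G = np.zeros((n_h, n_u))                     # control matrix (assumed)

def arx_step(F, G, h_t, u_t, eps_next):
    """h(t+1) = F h(t) + G u(t) + eps(t+1)."""
    return F @ h_t + G @ u_t + eps_next
```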

  25. Consciousness Consider an overcomplete hidden representation made of a set of recurrent networks • Consider a set of echo state networks • Temporal extension • all of them can reconstruct, with more or less error • which one to use? • they can compete over time • the winner represents a finite part of the past and the future


  29. Consciousness Consider an overcomplete hidden representation made of a set of recurrent networks. This model can explain rivalry situations • Consider a set of echo state networks • Temporal extension • all of them can reconstruct, with more or less error • which one to use? • they can compete over time • the winner represents a finite part of the past and the future
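A toy rendering of the competition described on these slides: several predictive networks each reconstruct the input stream, and the one with the smallest accumulated error over a time window wins and represents that stretch of past and future. The shapes, the squared-error score and the hard winner-take-all choice below are illustrative assumptions, not the echo state network machinery of the talk.

```python
# Competition among candidate reconstructing networks over a time window.
import numpy as np

def winner_over_window(predictions, inputs):
    """predictions: array (n_models, T, n_x), each network's reconstruction of
    the window; inputs: array (T, n_x). Returns the index of the network with
    the smallest accumulated squared reconstruction error."""
    errors = np.sum((predictions - inputs[None, :, :]) ** 2, axis=(1, 2))
    return int(np.argmin(errors))            # the winner represents this window
```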

  30. Conclusions Resolution of the fallacy plus Hebbian constraints lead to a structure that • resembles the “Cognitive Map” of rats

  31. Conclusions Resolution of the fallacy plus Hebbian constraints lead to a structure that • resembles the “Cognitive Map” of rats • searches for hidden cause-effect relationships

  32. Conclusions Resolution of the fallacy plus Hebbian constraints lead to a structure that • resembles the “Cognitive Map” of rats • searches for hidden cause-effect relationships • Questions for future work: what kind of networks arise from the extensions, i.e., • Kalman filter embedded into reinforcement learning • Bayesian, actively controlled learning, if the constraint of Hebbian learning is taken rigorously?

  33. Thank you for your attention!
