
Continual Learning with Echo State Networks

Experimental evaluation of Echo State Networks in class-incremental continual learning scenarios.


Presentation Transcript


  1. Continual Learning with Echo State Networks
  Are random recurrent networks suitable for continual learning?
  Andrea Cossu, Davide Bacciu, Antonio Carta, Claudio Gallicchio, Vincenzo Lomonaco
  Scuola Normale Superiore, University of Pisa
  ESANN 2021

  2. Learning continuously without forgetting: an open challenge
  [Slide diagram: data distributions drifting over time]
  ● Quickly learn new experiences
  ● Exploit existing knowledge
  ● Mitigate catastrophic forgetting
  ● …

  3. Recurrent models in continual learning
  Everything is perceived as a sequence… more or less!
  What is the impact of existing CL strategies on recurrent models?
  A. Cossu, A. Carta, V. Lomonaco, D. Bacciu, Continual Learning for Recurrent Neural Networks: An Empirical Evaluation, Neural Networks, 2021.

  4. Random recurrent networks for CL
  You can't forget if you are not changing.
  ● No recurrent parameters to learn
  ● Apply CL strategies to the trainable parameters only
  ● Treat the untrained components as a pre-trained network

  5. Echo State Network
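The ESN behind slides 4 and 5 keeps its input and recurrent weights random and frozen; only a linear readout on top of the reservoir states is trained. Below is a minimal NumPy sketch of that idea. The class name, hyperparameters (spectral radius, input scaling, ridge coefficient), and the choice of classifying from the final reservoir state are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

class ESN:
    """Minimal Echo State Network: frozen random reservoir, trainable linear readout."""

    def __init__(self, n_inputs, n_reservoir, n_outputs,
                 spectral_radius=0.9, input_scaling=1.0, ridge=1e-6):
        # Random input and recurrent weights, fixed after initialisation.
        self.W_in = rng.uniform(-input_scaling, input_scaling, (n_reservoir, n_inputs))
        W = rng.uniform(-1.0, 1.0, (n_reservoir, n_reservoir))
        # Common heuristic for the echo state property: rescale the recurrent
        # matrix so its spectral radius stays around (or below) 1.
        self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
        self.ridge = ridge
        self.W_out = np.zeros((n_outputs, n_reservoir))  # the only trained weights

    def final_state(self, sequence):
        """Run the frozen reservoir over one input sequence (e.g. MNIST rows)."""
        x = np.zeros(self.W.shape[0])
        for u in sequence:
            x = np.tanh(self.W_in @ u + self.W @ x)
        return x

    def fit(self, sequences, targets):
        """Ridge regression on final reservoir states; the reservoir is never updated."""
        X = np.stack([self.final_state(s) for s in sequences])  # (N, n_reservoir)
        Y = np.asarray(targets, dtype=float)                    # (N, n_outputs) one-hot
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.W_out = np.linalg.solve(A, X.T @ Y).T

    def predict(self, sequences):
        X = np.stack([self.final_state(s) for s in sequences])
        return X @ self.W_out.T  # class scores; take argmax for labels
```

Ridge regression here just shows the standard ESN training pipeline; to apply strategies such as EWC or LwF, the paper instead updates the readout with gradient-based training while the reservoir stays frozen.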

  6. Experimental setting
  ● Avalanche library: https://github.com/ContinualAI/avalanche
  ● Split row-MNIST: 5 experiences, 2 classes each
  ● Synthetic Speech Commands: 10 experiences, 2 classes each
  ● Strategies: EWC, LwF, Replay, Streaming Deep LDA, Naive and Joint Training
  ● Models: ESN and LSTM
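This setup maps onto Avalanche's class-incremental benchmarks and strategy wrappers. The sketch below assumes a recent Avalanche release (module paths have moved across versions) and uses the library's standard SplitMNIST with a SimpleMLP placeholder rather than the paper's row-wise sequential MNIST and ESN/LSTM models; all hyperparameter values are illustrative.

```python
import torch
from torch.nn import CrossEntropyLoss
from torch.optim import SGD

from avalanche.benchmarks.classic import SplitMNIST
from avalanche.models import SimpleMLP
# In older Avalanche releases this import path is avalanche.training.strategies.
from avalanche.training.supervised import Replay

# 5 experiences with 2 classes each, matching the Split MNIST setting above.
benchmark = SplitMNIST(n_experiences=5)

# Placeholder model; the paper evaluates ESN and LSTM classifiers instead.
model = SimpleMLP(num_classes=10)

strategy = Replay(
    model,
    SGD(model.parameters(), lr=1e-3),
    CrossEntropyLoss(),
    mem_size=500,       # replay buffer size (illustrative value)
    train_mb_size=32,
    train_epochs=1,
    eval_mb_size=128,
)

# Train on each experience in turn, evaluating on the full test stream.
for experience in benchmark.train_stream:
    strategy.train(experience)
    strategy.eval(benchmark.test_stream)
```

Swapping Replay for EWC, LwF, or Naive only changes the strategy constructor; the benchmark and the train/eval loop stay the same.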

  7. Results

  8. A promising future
  Expand the analysis
  ● reservoir topologies
  ● unsupervised fine-tuning
  Continual learning on low-powered devices
  ● neuromorphic hardware
  Ad-hoc strategies
  ● exploit readout linearity (see the sketch below)
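"Exploit readout linearity" is only named on the slide, not spelled out. One plausible reading: because the readout is linear, the ridge-regression sufficient statistics X^T X and X^T Y can be accumulated across experiences and the readout re-solved in closed form after each one, without storing raw data. The sketch below is a hypothetical illustration of that direction, not a strategy from the paper.

```python
import numpy as np

class IncrementalRidgeReadout:
    """Hypothetical continual update of a linear ESN readout.

    Accumulates the ridge-regression sufficient statistics X^T X and
    X^T Y over experiences, so the readout solved after experience k
    equals the one obtained by joint training on experiences 1..k.
    """

    def __init__(self, n_features, n_classes, ridge=1e-6):
        self.A = ridge * np.eye(n_features)          # running X^T X (+ ridge term)
        self.B = np.zeros((n_features, n_classes))   # running X^T Y
        self.W_out = np.zeros((n_classes, n_features))

    def update(self, X, Y):
        """X: (N, n_features) reservoir states; Y: (N, n_classes) one-hot labels."""
        self.A += X.T @ X
        self.B += X.T @ Y
        self.W_out = np.linalg.solve(self.A, self.B).T

    def predict(self, X):
        return X @ self.W_out.T
```

Because no gradient steps touch earlier weights, this closed-form update would sidestep catastrophic forgetting in the readout entirely, at the cost of keeping the (n_features × n_features) statistics matrix.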

  9. Questions?
