There are 15 repositories under the elastic-weight-consolidation topic.
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
Continual learning baselines and strategies from popular papers, using Avalanche. We include EWC, SI, GEM, A-GEM, LwF, iCaRL, GDumb, and other strategies.
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
Elastic weight consolidation technique for incremental learning.
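Several of these repositories implement the same core idea: EWC augments the new task's loss with a quadratic penalty that anchors each weight to its post-training value on the old task, weighted by that weight's (diagonal) Fisher information. Below is a minimal, framework-free sketch of that penalty term; the function name, flat parameter lists, and the numeric values are illustrative assumptions, not code from any of the listed repositories.

```python
# Sketch of the EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2,
# where F_i is the diagonal Fisher information estimated on the previous task
# and theta_star_i are the parameters learned on that task.

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Quadratic EWC penalty over flat parameter lists (illustrative)."""
    return 0.5 * lam * sum(
        f * (t - ts) ** 2
        for t, ts, f in zip(theta, theta_star, fisher)
    )

# Weights with high Fisher values (important for the old task) are
# penalized more strongly for drifting away from theta_star.
theta      = [1.0, 2.0]   # current parameters while training the new task
theta_star = [0.5, 2.0]   # parameters after the old task
fisher     = [4.0, 0.1]   # per-weight importance on the old task
print(ewc_penalty(theta, theta_star, fisher, lam=2.0))  # 0.5 * 2 * 4 * 0.25 = 1.0
```

In a real PyTorch training loop this term would simply be added to the task loss before calling `backward()`.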
PyTorch implementation of a VAE-based generative classifier, as well as other class-incremental learning methods that do not store data (DGR, BI-R, EWC, SI, CWR, CWR+, AR1, the "labels trick", SLDA).
Work in progress: PyTorch implementation of EWC (Elastic Weight Consolidation) for supervised learning and deep Q-learning, as introduced in "Overcoming Catastrophic Forgetting in Neural Networks".
Comparative evaluation of incremental machine learning methods.
TensorFlow 1.x implementation of EWC, evaluated on permuted MNIST
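Permuted MNIST, the benchmark mentioned above, builds each task by applying one fixed random pixel permutation to every image, so the tasks share statistics but not input layout. A stdlib-only sketch of that construction (function names and the 784-pixel flattened-image assumption are illustrative):

```python
import random

def make_task_permutation(n_pixels=784, seed=0):
    """One fixed pixel permutation defines one permuted-MNIST task."""
    rng = random.Random(seed)
    perm = list(range(n_pixels))
    rng.shuffle(perm)
    return perm

def apply_permutation(image, perm):
    """Reorder a flattened image's pixels according to the task permutation."""
    return [image[i] for i in perm]

# The same permutation is applied to every image of a task, so a task
# preserves pixel values while scrambling their positions.
image = list(range(784))                     # stand-in for a flattened MNIST image
perm = make_task_permutation(seed=1)
permuted = apply_permutation(image, perm)
assert sorted(permuted) == image             # same pixels, new fixed order
```

A continual-learning run then trains on task 1 (seed 1), switches to task 2 (seed 2), and measures how much accuracy on task 1 degrades.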
Multi-domain adaptation for quick sentiment analysis across multiple task categories, such as classifying the nature of reviews of various products on the Amazon website.
An investigation into sequential learning of tasks using feed-forward networks built with TensorFlow
CEL Continual Learning
This is the temporary version of the MINT Lab continual-learning website.
Federated Echo State Networks for Stress Prediction in the Automotive Use Case. Master Thesis in Artificial Intelligence @ University of Pisa
An investigation into weight importance measures in neural networks, relating to sequential learning and interpretability.