# Dark Experience for General Continual Learning: a Strong, Simple Baseline

Official repository of the NeurIPS 2020 paper *Dark Experience for General Continual Learning: a Strong, Simple Baseline*.
## Setup

- Use `./utils/main.py` to run experiments.
- Use the argument `--load_best_args` to use the best hyperparameters from the paper.
- New models can be added to the `models/` folder.
- New datasets can be added to the `datasets/` folder.
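For example, a typical invocation might look like the following; apart from `--load_best_args`, the flag names (`--model`, `--dataset`, `--buffer_size`) and their values are assumptions based on common continual-learning codebases, so check `python ./utils/main.py --help` for the exact argument names.

```shell
# Hypothetical example run; flag names other than --load_best_args are assumptions.
python ./utils/main.py --model derpp --dataset seq-cifar10 --buffer_size 500 --load_best_args
```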
## Models

- Gradient Episodic Memory (GEM)
- A-GEM
- A-GEM with Reservoir (A-GEM-R)
- Experience Replay (ER)
- Meta-Experience Replay (MER)
- Function Distance Regularization (FDR)
- Greedy gradient-based Sample Selection (GSS)
- Hindsight Anchor Learning (HAL)
- Incremental Classifier and Representation Learning (iCaRL)
- online Elastic Weight Consolidation (oEWC)
- Synaptic Intelligence (SI)
- Learning without Forgetting (LwF)
- Progressive Neural Networks (PNN)
- Dark Experience Replay (DER)
- Dark Experience Replay++ (DER++)
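Among these, DER trains on the current batch's classification loss plus an L2 term that matches the network's logits on buffered examples to the logits stored alongside them ("dark knowledge"); DER++ adds a further cross-entropy replay term on buffered labels. A minimal NumPy sketch of this objective (function and argument names are illustrative, not the repository's API):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(logits, labels):
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def derpp_loss(cur_logits, cur_labels,
               buf1_logits, buf1_stored_logits,  # first buffer draw: match stored logits
               buf2_logits, buf2_labels,         # second buffer draw: replay stored labels
               alpha=0.5, beta=0.5):
    """DER++ objective sketch: task CE + alpha * logit matching + beta * replay CE.
    Setting beta=0 recovers plain DER."""
    ce = cross_entropy(cur_logits, cur_labels)
    logit_match = np.mean((buf1_logits - buf1_stored_logits) ** 2)
    replay_ce = cross_entropy(buf2_logits, buf2_labels)
    return ce + alpha * logit_match + beta * replay_ce
```

Setting `beta=0` recovers the plain DER loss, which is how the paper presents DER++ as an extension of DER.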
## Class-IL / Task-IL settings
- Sequential MNIST
- Sequential CIFAR-10
- Sequential Tiny ImageNet
## Domain-IL settings
- Permuted MNIST
- Rotated MNIST
## General Continual Learning setting
- MNIST-360
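MNIST-360 provides no task boundaries, so rehearsal methods in this setting populate their memory with reservoir sampling, which keeps every example seen so far in a fixed-size buffer with equal probability. A minimal sketch (the class name and API are illustrative, not the repository's):

```python
import random

class ReservoirBuffer:
    """Fixed-size memory; after n insertions, each of the n examples
    remains in the buffer with probability capacity / n."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            # Buffer not full yet: always store.
            self.data.append(example)
        else:
            # Replace a random slot with probability capacity / seen.
            idx = self.rng.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = example
```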
## Citation

```
@inproceedings{buzzega2020dark,
  title={Dark Experience for General Continual Learning: a Strong, Simple Baseline},
  author={Buzzega, Pietro and Boschini, Matteo and Porrello, Angelo and Abati, Davide and Calderara, Simone},
  booktitle={Advances in Neural Information Processing Systems 33},
  year={2020}
}
```