ThomasScialom / T0_continual_learning

Adding new tasks to T0 without catastrophic forgetting

Paper:

https://arxiv.org/abs/2205.12393

Models:

https://huggingface.co/ThomasNLG/CT0-11B

Code:

We haven't cleaned up all the code yet, but most of the steps can be found in this Colab notebook.

In particular, the notebook contains:

  • The steps to create the datasets and to format them with rehearsal. Note that some raw datasets may no longer be downloadable (e.g., broken links on the HF Hub). You can still find the processed data in our main folders, along with the formatted datasets.
  • The evaluation scripts.
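To illustrate the rehearsal idea from the formatting step above, here is a minimal sketch of how training data for a new task can be interleaved with a small fraction of examples replayed from previously learned tasks. The function name, the rehearsal fraction, and the toy (prompt, target) pairs are illustrative placeholders, not the exact settings from the paper:

```python
import random

def mix_with_rehearsal(new_task_examples, old_task_examples,
                       rehearsal_fraction=0.01, seed=0):
    """Return training data for a new task, interleaved with a small
    fraction of examples replayed from previously learned tasks."""
    rng = random.Random(seed)
    # Number of old-task examples to replay, relative to the new task size.
    n_rehearsal = max(1, int(len(new_task_examples) * rehearsal_fraction))
    replayed = rng.sample(old_task_examples,
                          min(n_rehearsal, len(old_task_examples)))
    mixed = list(new_task_examples) + replayed
    rng.shuffle(mixed)
    return mixed

# Toy usage with hypothetical prompted (input, target) pairs:
new_data = [(f"simplify: sentence {i}", f"simple {i}") for i in range(200)]
old_data = [(f"summarize: doc {i}", f"summary {i}") for i in range(1000)]
train_set = mix_with_rehearsal(new_data, old_data)  # 200 new + 2 replayed
```

Mixing even a small replayed fraction into each new task's training set is what counters catastrophic forgetting during sequential fine-tuning.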

For training, we plan to release the scripts. But you don't need to wait for them: we applied nothing fancy, simply fine-tuning T5 using the standard HF framework. All the parameters are mentioned in our paper.
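As a rough configuration sketch of "standard HF fine-tuning", the setup could look like the fragment below. The model name, output directory, and every hyperparameter here are placeholders for illustration only; the actual values are the ones reported in the paper, and the training dataset (tokenized, rehearsal-mixed examples) is not shown:

```python
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          Seq2SeqTrainingArguments)

# Placeholder base checkpoint; the paper fine-tunes T0 (a T5 derivative).
model_name = "bigscience/T0_3B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative arguments; see the paper for the real hyperparameters.
args = Seq2SeqTrainingArguments(
    output_dir="ct0-checkpoints",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    num_train_epochs=1,
    predict_with_generate=True,
)
# A Seq2SeqTrainer would then be built from (model, args, train_dataset,
# tokenizer) and trained with trainer.train().
```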

Material:

All the material required by the notebook, including the training data, the predictions, and the checkpoints, is publicly available in the main folders.

License: MIT

