forrestdavis / gpt2-recycle

As good as new. How to successfully recycle English GPT-2 to make models for other languages


GPT-2 Recycled for Italian and Dutch

Wietse de Vries • Malvina Nissim

Model description

In our paper, we describe a multi-stage adaptation method for transferring GPT-2 to Italian and Dutch without unnecessary retraining. This repository contains the source code; the final models are available on the Hugging Face model hub (see below).

We publish two types of models:

  • Models where only the lexical layer is retrained for the new language, while the Transformer layers are kept identical to the English model. The lexical layers of these models are, in practice, automatically aligned with those of the equivalent English model. Use these if you are interested in alignment properties.
  • Models with retrained lexical embeddings followed by additional training of the full model. Use these if you want to generate more realistic text.
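The first model type corresponds to freezing every Transformer weight and updating only the (tied) token embeddings. A minimal PyTorch sketch of that training setup is below; `TinyGPT` and `freeze_for_stage1` are illustrative names standing in for GPT-2, not code from this repository:

```python
import torch
import torch.nn as nn

class TinyGPT(nn.Module):
    """Toy stand-in for GPT-2: token embeddings, one Transformer block,
    and an output projection tied to the embedding matrix."""
    def __init__(self, vocab_size=100, d_model=16):
        super().__init__()
        self.wte = nn.Embedding(vocab_size, d_model)  # lexical layer
        self.block = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)

    def forward(self, ids):
        h = self.block(self.wte(ids))
        return h @ self.wte.weight.T  # tied LM head

def freeze_for_stage1(model):
    """Stage 1: retrain only the lexical layer for the new language;
    the Transformer weights stay as learned on English."""
    for p in model.parameters():
        p.requires_grad = False
    model.wte.weight.requires_grad = True

model = TinyGPT()
freeze_for_stage1(model)
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
# Only "wte.weight" remains trainable; an optimizer built over
# filter(lambda p: p.requires_grad, model.parameters()) updates embeddings only.
```

For the second model type, one would simply unfreeze all parameters after this stage and continue training the full model.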

For details, check out our paper on arXiv and the models on the 🤗 Hugging Face model hub (see links for specific models below).

Models

Dutch

Italian

How to use

from transformers import pipeline

pipe = pipeline("text-generation", model="GroNLP/gpt2-small-dutch")
print(pipe("Was ik maar een"))  # Dutch prompt: "If only I were a"

from transformers import AutoTokenizer, AutoModel, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("GroNLP/gpt2-small-dutch")
model = AutoModel.from_pretrained("GroNLP/gpt2-small-dutch")    # PyTorch
model = TFAutoModel.from_pretrained("GroNLP/gpt2-small-dutch")  # TensorFlow

BibTeX entry

@misc{devries2020good,
      title={As good as new. How to successfully recycle English GPT-2 to make models for other languages}, 
      author={Wietse de Vries and Malvina Nissim},
      year={2020},
      eprint={2012.05628},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

About

License: Apache License 2.0


Languages

Language: Jupyter Notebook 81.2% · Python 18.8%