This project is based on 🤗 Transformers. This tutorial shows you how to train a GPT-2 model for your own language (such as Chinese or Japanese) with just a few lines of code, using TensorFlow 2.
You can try this project in Colab right now.
├── configs
│   ├── test.py
│   └── train.py
├── build_tokenizer.py
├── predata.py
├── predict.py
└── train.py
git clone git@github.com:mymusise/gpt2-quickly.git
cd gpt2-quickly
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
Here is an example of a raw dataset: raw.txt
python build_tokenizer.py
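The internals of `build_tokenizer.py` aren't shown here, but conceptually a tokenizer build boils down to mapping each symbol in the corpus to an integer id. As a rough, hypothetical sketch (assuming a simple character-level vocabulary, which is a reasonable fit for languages like Chinese; the helper names below are illustrative, not the project's actual code):

```python
from collections import Counter

def build_vocab(text, specials=("[PAD]", "[UNK]", "[BOS]", "[EOS]")):
    """Build a character-level vocabulary: special tokens first,
    then corpus characters ordered by frequency."""
    counts = Counter(text)
    tokens = list(specials) + [ch for ch, _ in counts.most_common()]
    return {tok: idx for idx, tok in enumerate(tokens)}

def encode(text, vocab):
    """Map each character to its id, falling back to [UNK]."""
    unk = vocab["[UNK]"]
    return [vocab.get(ch, unk) for ch in text]

vocab = build_vocab("你好，世界。你好！")
ids = encode("你好", vocab)
```

Real subword tokenizers (BPE, WordPiece) are more involved, but the input/output contract is the same: raw text in, id sequences out.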
python predata.py --n_processes=2
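Preprocessing for GPT-2 training typically means slicing the tokenized corpus into fixed-length blocks; the `--n_processes` flag suggests `predata.py` parallelizes this across worker processes. A minimal sketch of the blocking step (`make_blocks` is a hypothetical helper, not the actual code of `predata.py`):

```python
def make_blocks(token_ids, block_size):
    """Slice a long id sequence into fixed-length training blocks.

    The trailing remainder that doesn't fill a whole block is dropped,
    mirroring common GPT-2 preprocessing. In the real script this work
    would be split across --n_processes workers (e.g. with
    multiprocessing.Pool), one shard of the corpus per worker.
    """
    n = (len(token_ids) // block_size) * block_size
    return [token_ids[i:i + block_size] for i in range(0, n, block_size)]

blocks = make_blocks(list(range(10)), block_size=4)
# blocks -> [[0, 1, 2, 3], [4, 5, 6, 7]]
```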
python train.py
python predict.py
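Generation in `predict.py` is autoregressive: the model predicts one token at a time and feeds it back in. A toy greedy-decoding loop shows the shape of that process (the `next_logits` callback is a stand-in for the trained TensorFlow model, which is not reproduced here):

```python
def greedy_generate(prompt_ids, next_logits, max_new_tokens, eos_id):
    """Toy greedy decoding loop: repeatedly pick the highest-scoring
    token and append it, until EOS or the length budget is hit."""
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        logits = next_logits(ids)                       # one score per vocab entry
        token = max(range(len(logits)), key=logits.__getitem__)
        ids.append(token)
        if token == eos_id:
            break
    return ids

# Dummy "model": always prefers the token after the last one, wrapping at 5.
def dummy_logits(ids):
    nxt = (ids[-1] + 1) % 5
    return [1.0 if tok == nxt else 0.0 for tok in range(5)]

out = greedy_generate([0], dummy_logits, max_new_tokens=3, eos_id=4)
# out -> [0, 1, 2, 3]
```

The real script would sample from the model's output distribution (top-k / nucleus sampling) rather than always taking the argmax, which this sketch does for determinism.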
ENV=FINETUNE python finetune.py
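The `ENV=FINETUNE` prefix suggests the scripts choose their configuration from an environment variable. A minimal sketch of that pattern (the config values and names here are illustrative, not the project's actual settings in `configs/`):

```python
import os

# Illustrative configs -- the real values live in configs/train.py etc.
CONFIGS = {
    "TRAIN": {"learning_rate": 1e-4, "epochs": 10},
    "FINETUNE": {"learning_rate": 1e-5, "epochs": 2},
}

def load_config():
    """Pick a config based on the ENV environment variable,
    defaulting to the training config."""
    env = os.environ.get("ENV", "TRAIN").upper()
    return CONFIGS[env]

os.environ["ENV"] = "FINETUNE"
config = load_config()
# config["epochs"] -> 2
```

Fine-tuning typically uses a smaller learning rate and fewer epochs than training from scratch, which is why a separate config makes sense.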