
GPT2 Quickly

Build your own GPT2 model quickly, without a lot of unnecessary work.

Build

This project is based on 🤗 Transformers. This tutorial shows you how to train a GPT2 model for your own language (such as Chinese or Japanese) with just a few lines of code, using TensorFlow 2.

You can try this project in Colab right now.

Main files


├── configs
│   ├── test.py
│   └── train.py
├── build_tokenizer.py
├── predata.py
├── predict.py
└── train.py

Preparation

git clone git@github.com:mymusise/gpt2-quickly.git
cd gpt2-quickly
python3 -m venv venv
source venv/bin/activate

pip install -r requirements.txt

0x00. Prepare your raw dataset

Here is an example of a raw dataset: raw.txt
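
The file is just plain UTF-8 text. As a hedged illustration (whether it is one sentence per line is an assumption; the linked raw.txt shows the real layout), a placeholder corpus could be produced like this:

# Hypothetical stand-in for raw.txt: plain UTF-8 text,
# here one sentence per line (assumption).
sentences = [
    "The quick brown fox jumps over the lazy dog.",
    "Language models learn to predict the next token.",
]
with open("raw.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(sentences))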

0x01. Build vocab

python build_tokenizer.py
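
As a rough sketch of what this step does, assuming the 🤗 tokenizers library and a WordPiece vocabulary (the vocab size, special tokens, and output path below are illustrative assumptions, not the script's actual settings):

import os

from tokenizers import BertWordPieceTokenizer

# Train a WordPiece vocabulary on the raw corpus.
# vocab_size and special_tokens are illustrative assumptions.
tokenizer = BertWordPieceTokenizer()
tokenizer.train(
    files=["raw.txt"],
    vocab_size=20000,
    special_tokens=["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"],
)
os.makedirs("tokenizer", exist_ok=True)
tokenizer.save_model("tokenizer")  # writes tokenizer/vocab.txt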

0x02. Tokenize

python predata.py --n_processes=2
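
The --n_processes flag suggests the corpus is encoded in parallel. A hedged sketch of that idea with multiprocessing and a fast tokenizer (the paths, chunking strategy, and output format are assumptions):

import multiprocessing

import numpy as np
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("tokenizer")  # assumed vocab path

def encode(lines):
    # Encode a chunk of lines into lists of token ids.
    return tokenizer(lines)["input_ids"]

if __name__ == "__main__":
    with open("raw.txt", encoding="utf-8") as f:
        lines = [line.strip() for line in f if line.strip()]
    n_processes = 2
    chunks = [lines[i::n_processes] for i in range(n_processes)]
    with multiprocessing.Pool(n_processes) as pool:
        encoded = [ids for chunk in pool.map(encode, chunks) for ids in chunk]
    np.save("tokenized.npy", np.array(encoded, dtype=object))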

0x03. Train

python train.py
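
A minimal sketch of the training step with TFGPT2LMHeadModel in TensorFlow 2 (the model size, block length, batch size, and file names are assumptions; the real hyperparameters live in configs/train.py):

import numpy as np
import tensorflow as tf
from transformers import GPT2Config, TFGPT2LMHeadModel

# Build fixed-length training windows from the tokenized corpus.
encoded = np.load("tokenized.npy", allow_pickle=True)
flat = np.concatenate([np.asarray(ids) for ids in encoded])
block = 128  # assumed context window
windows = flat[: len(flat) // block * block].reshape(-1, block).astype(np.int32)
dataset = tf.data.Dataset.from_tensor_slices(windows).shuffle(1000).batch(8)

# Assumed small model; the real settings live in configs/train.py.
model = TFGPT2LMHeadModel(GPT2Config(
    vocab_size=20000, n_positions=block, n_embd=256, n_layer=6, n_head=8))
optimizer = tf.keras.optimizers.Adam(learning_rate=3e-5)

for batch in dataset:
    with tf.GradientTape() as tape:
        # labels=batch makes the model compute the shifted LM loss itself.
        loss = tf.reduce_mean(model(batch, labels=batch, training=True).loss)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
model.save_pretrained("checkpoints")  # assumed output directory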

0x04. Predict

python predict.py
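
A hedged sketch of the prediction step (the checkpoint path, prompt, and sampling parameters are illustrative assumptions):

from transformers import BertTokenizerFast, TFGPT2LMHeadModel

tokenizer = BertTokenizerFast.from_pretrained("tokenizer")  # assumed path
model = TFGPT2LMHeadModel.from_pretrained("checkpoints")    # assumed path

input_ids = tokenizer.encode("Once upon a time", return_tensors="tf")
# Sample a continuation; max_length, top_k, and top_p are illustrative.
output_ids = model.generate(
    input_ids, max_length=100, do_sample=True, top_k=50, top_p=0.95)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))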

0x05. Fine-tune

ENV=FINETUNE python finetune.py
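
The ENV=FINETUNE variable presumably switches the script into fine-tuning mode. A hedged sketch of what that can look like (the guard, paths, and learning rate are assumptions, not the repo's actual wiring):

import os

import tensorflow as tf
from transformers import TFGPT2LMHeadModel

# Hypothetical: "ENV=FINETUNE" selects fine-tuning behavior.
if os.environ.get("ENV") == "FINETUNE":
    # Resume from previously trained weights instead of a fresh model,
    # and use a smaller learning rate on the new dataset.
    model = TFGPT2LMHeadModel.from_pretrained("checkpoints")  # assumed path
    optimizer = tf.keras.optimizers.Adam(learning_rate=1e-5)
    # ... then run the same training loop as in 0x03 on the new data.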
