Norod / TrainGPT2-127M-FromScratch

A trio of Google Colab notebooks (.ipynb) for training a GPT-2 (127M) model from scratch using gpt-2-simple, which is useful for non-English and other languages.
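
Below is a minimal, illustrative sketch of the gpt-2-simple training loop only. It does not reproduce the notebooks' from-scratch setup (a fresh model directory and a custom BPE vocabulary for non-English text); the file name `corpus.txt`, the run name, and all step counts are hypothetical placeholders.

```python
import gpt_2_simple as gpt2

# Borrow the 124M base configuration as a starting point (assumption;
# the notebooks set up their own model directory for true from-scratch training).
model_name = "124M"
gpt2.download_gpt2(model_name=model_name)

sess = gpt2.start_tf_sess()
gpt2.finetune(
    sess,
    dataset="corpus.txt",   # plain-text training corpus (placeholder name)
    model_name=model_name,
    steps=1000,             # number of training steps (placeholder)
    run_name="run1",        # checkpoint subdirectory (placeholder)
    sample_every=200,       # print a generated sample every 200 steps
    save_every=500,         # write a checkpoint every 500 steps
)

# Generate text from the trained checkpoint.
gpt2.generate(sess, run_name="run1")
```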
