adey4 / toy-gpt

A toy Transformer model to demonstrate natural language generation capabilities on consumer hardware


Toy-GPT

Toy-GPT is a decoder-only transformer built from scratch with NumPy and PyTorch, trained to generate natural language in the style of a provided input.txt
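The core of a decoder-only transformer is causal self-attention: each position may attend only to itself and earlier positions. A minimal single-head sketch in PyTorch (class and parameter names here are illustrative, not necessarily those used in this repo):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Single-head self-attention with a lower-triangular (causal) mask."""

    def __init__(self, n_embd, block_size):
        super().__init__()
        self.key = nn.Linear(n_embd, n_embd, bias=False)
        self.query = nn.Linear(n_embd, n_embd, bias=False)
        self.value = nn.Linear(n_embd, n_embd, bias=False)
        # Lower-triangular mask: position t can only see positions <= t
        self.register_buffer("mask", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        # Scaled dot-product attention scores
        att = (q @ k.transpose(-2, -1)) / (C ** 0.5)
        # Block attention to future positions
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return att @ v

# Toy forward pass: batch of 2, context length 8, embedding size 32
x = torch.randn(2, 8, 32)
sa = CausalSelfAttention(n_embd=32, block_size=8)
out = sa(x)
print(out.shape)  # torch.Size([2, 8, 32])
```

A full decoder block would wrap this in layer norm, a feed-forward layer, and residual connections, then stack several such blocks.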

Dependencies

  • Python 3
  • NumPy: conda install numpy
  • PyTorch: conda install pytorch torchvision -c pytorch
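Once installed, you can confirm PyTorch works and pick the fastest available backend (MPS on Apple silicon, CUDA on NVIDIA GPUs, otherwise CPU). The device names below are standard PyTorch backends:

```python
import torch

# Prefer Apple's Metal backend, then CUDA, then fall back to CPU
if torch.backends.mps.is_available():
    device = "mps"
elif torch.cuda.is_available():
    device = "cuda"
else:
    device = "cpu"

print(f"Using device: {device}")
```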

Limitations

  • Training takes ~1 hr on an Apple M1 Pro chip
  • Language generation quality is limited by available compute
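Before any training, the raw input.txt must be turned into integer sequences. Toy GPTs of this kind commonly use a character-level vocabulary; a minimal encoder/decoder under that assumption (the repo's actual tokenization may differ):

```python
# Hypothetical character-level tokenizer over a tiny corpus
text = "To be, or not to be"

# Vocabulary: every distinct character in the corpus
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> int
itos = {i: ch for ch, i in stoi.items()}      # int -> char

def encode(s):
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of token ids back to a string."""
    return "".join(itos[i] for i in ids)

print(decode(encode("to be")))  # prints "to be"
```

Character-level vocabularies keep the model tiny, which is why outputs like the sample below contain plausible-looking but invented words.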

Example Output

Trained on Shakespearean text:

SICINIUS:
Is it strange?

Herald:
He's deceited, and children from his new spid
Then whomen he dares to him: were he worse.

BRUTUS:
You have pirtly not him.

MENENIUS:
What's the prisoner have not a silfa?

MONTAGUE:
O, and both shame, Menenius. Stanless, Thou art purpose;
And said thou pen for thy melting there,--

BENVOLIO:
Two sir, the earth proofs rids too come hither;
I thank you out, as thought sook for Ireland,

FRIAR LAURENCE:
His son, do your morself, that leaven your honours
Sufferable in more and suffer five.
A horse! High-graced York rights. And bother Montague
