bclarkson-code / Tricycle

Autograd to GPT-2 completely from scratch


Add inference

bclarkson-code opened this issue · comments

Currently, there is no way to get predictions out of the language model. This should be added. This will involve:

  • Adding `encode` and `decode` methods to the tokeniser for easier use
  • Adding a way to convert logits into tokens
  • Building an inference loop that passes tokens through the model, generates a prediction, appends it, and repeats
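The steps above could be sketched roughly as follows. This is only an illustration, not the planned implementation: the tokeniser, `logits_to_token` helper, and the callable-model interface are all hypothetical stand-ins for whatever Tricycle's actual classes end up looking like.

```python
import numpy as np


class CharTokeniser:
    """Toy character-level tokeniser (hypothetical stand-in for Tricycle's tokeniser)."""

    def __init__(self, vocab):
        self.stoi = {ch: i for i, ch in enumerate(vocab)}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, text):
        return [self.stoi[ch] for ch in text]

    def decode(self, tokens):
        return "".join(self.itos[t] for t in tokens)


def logits_to_token(logits, temperature=None):
    """Convert a vector of logits into a single token id.

    temperature=None means greedy decoding (argmax); otherwise sample
    from the softmax of the temperature-scaled logits.
    """
    if temperature is None:
        return int(np.argmax(logits))
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))


def generate(model, tokeniser, prompt, n_tokens=20):
    """Inference loop: pass tokens through the model, pick the next token,
    append it, and repeat."""
    tokens = tokeniser.encode(prompt)
    for _ in range(n_tokens):
        logits = model(tokens)  # logits for the next token
        tokens.append(logits_to_token(logits))
    return tokeniser.decode(tokens)


# Demo with a dummy "model" that always predicts the next letter in the vocab.
vocab = "abcd"
tok = CharTokeniser(vocab)


def dummy_model(tokens):
    logits = np.full(len(vocab), -10.0)
    logits[(tokens[-1] + 1) % len(vocab)] = 10.0
    return logits


print(generate(dummy_model, tok, "a", n_tokens=3))  # → abcd
```

A real version would additionally need to truncate the context to the model's maximum sequence length once the generated sequence grows past it.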