Use llama.cpp with Python grammar syntax
AlphaAtlas opened this issue
llama.cpp just implemented a feature that constrains the LLM's output to conform to an arbitrary formal grammar — for example, a Python grammar:
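For a sense of what such a grammar looks like: llama.cpp's grammars are written in its GBNF format (rules of the form `name ::= ...`). The fragment below is a hand-written illustrative sketch, not a real Python grammar — it only admits a sequence of simple `print("...")` statements:

```
# Illustrative GBNF sketch (NOT a full Python grammar)
root      ::= statement+
statement ::= "print(" string ")" "\n"
string    ::= "\"" [a-zA-Z ]* "\""
```

A real Python grammar would of course be far larger, but even a narrow grammar like this guarantees the output is syntactically well-formed within the subset it defines.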
This seems relevant for this repo. Maybe the grammar could even be swapped or generated programmatically for more specific responses?
A language-specific finetune of Llama 2 also seems like a good idea, but perhaps that is something for another feature request...
Definitely agree that something like this could reduce the iterations required. Currently the "second stage" in llm.py handles this in a coarser way: it lints the output with very broad rules (since an autoformatter at the end makes the generated code consistent, the linter only checks for major issues like syntax mistakes and missing imports) and feeds the errors back into the LLM if it fails.
First we've gotta get the work to generalize the code for non-OpenAI LLMs done, though. 😅