Can I change the language model to replicate/llama-2-70b-chat?
izzuddin-noormy opened this issue
Do you want to lead the charge here? The only work is to find the usages of openai.ts and replace them with calls to llama-2 when an env variable is set - ideally making a layer between the callsites and openai.ts so the check is only done in one place.
Some spots may still want openai in the future if they use functions or something, but we can put that on hold for now.
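The layer described above could look something like this: one module that callsites import instead of openai.ts, with the env check in a single place. This is just a hedged sketch - the names (`chatCompletion`, `LLM_PROVIDER`) and the backend stubs are illustrative, not from the repo.

```typescript
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Stand-ins for the real backends (openai.ts and a llama client).
async function openaiChat(messages: ChatMessage[]): Promise<string> {
  return `openai: ${messages[messages.length - 1].content}`;
}

async function llamaChat(messages: ChatMessage[]): Promise<string> {
  return `llama: ${messages[messages.length - 1].content}`;
}

// The one place the env check happens; callsites import only this function,
// so swapping providers later means touching a single module.
export async function chatCompletion(messages: ChatMessage[]): Promise<string> {
  if (process.env.LLM_PROVIDER === "llama") {
    return llamaChat(messages);
  }
  return openaiChat(messages);
}
```

Callsites that genuinely need OpenAI-only features (e.g. function calling) could keep importing openai.ts directly until those have llama equivalents.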
Check out #184, which adds ollama support! Landing soon if someone can test it out locally.
Landed - reopen if it doesn't suit your needs. Hopefully pointing the URL at replicate works; I've only tested with local llama.