Does AI Town's language model have to use OpenAI's GPT API?
jimmy8399 opened this issue · comments
Does AI Town's language model have to use OpenAI's GPT API, or is it optional?
There's no other option at the moment. Feel free to add another integration.
What if I want to test a local language model, such as LLaMA? How should I revise the code?
What are all the models folks would like to see?
- LLaMA on Replicate
- LLaMA locally
Those are the top requests I've seen - any other interesting ones? And are folks looking for llama-2-70b-chat or something else?
I want to use llama-2-70b-chat, locally.
In #184 we added Ollama support, which you can run locally for chat (but not for embeddings yet).
It landed! Try it out and let us know how it's working for you. It's documented in the README.
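For anyone poking at the Ollama route, here's a minimal sketch of what a non-streaming request to a locally running Ollama server looks like. The endpoint and payload shape follow Ollama's documented `/api/generate` API; the model name `llama2` is just an example of something you'd have pulled with `ollama pull` (this is not AI Town's internal integration code, just the raw API it talks to):

```python
import json
import urllib.request

# Ollama serves on this port by default when you run `ollama serve`
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama2") -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({
        "model": model,    # any model you've pulled, e.g. via `ollama pull llama2`
        "prompt": prompt,
        "stream": False,   # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_request("Say hello in one word.")
# With `ollama serve` running locally, you would then do:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```

The actual network call is left commented out since it needs a live Ollama server; the point is just to show the endpoint and payload you can test against with `curl` before wiring it into the app.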