Cannot chat successfully with ollama
brunoais opened this issue · comments
Describe the bug
Whenever I try to chat with Ollama, I get "Sorry, I don’t understand. Please try again."
When I use `ollama run` with the same model, it works as expected. I'm using `codellama:7b-instruct` in both cases.
For clarification: FIM is working as expected.
To Reproduce
Steps to reproduce the behavior:
- Try to chat with bot
- Get error response
Expected behavior
Chatting with the bot succeeds
Screenshots
Desktop (please complete the following information):
N/A
Additional context
I'm using the default system message provided by twinny. I also tried blanking the template and trying again, but it made no difference.
Twinny 2.X was working, but I have no way to downgrade the version, at least not from the UI.
See also: #191, as the cause may be related, just invisible to the user.
Hey, this was updated as Ollama now supports the OpenAI specification.
https://ollama.com/blog/openai-compatibility
Please update Ollama and try again.
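After updating, a quick sanity check is to call Ollama's OpenAI-compatible chat endpoint directly (this assumes Ollama is running on its default local address, `localhost:11434`, and that the model has already been pulled):

```shell
# Ask the local Ollama server for a chat completion via the
# OpenAI-compatible /v1/chat/completions route.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "codellama:7b-instruct",
    "messages": [
      {"role": "user", "content": "Say hello"}
    ]
  }'
```

If this returns a JSON chat completion rather than an error, the server side is fine and any remaining problem is in the extension's configuration.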
Updating Ollama fixed it. My Ollama was outdated.