rjmacarthy / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.

Home Page: https://rjmacarthy.github.io/twinny-docs/

Cannot chat successfully with ollama

brunoais opened this issue · comments

Describe the bug

No matter what I try when chatting with Ollama, I get “Sorry, I don’t understand. Please try again.”

When I use `ollama run` with the same model, it works as expected.
I'm using codellama:7b-instruct in both cases.

For clarification: FIM is working as expected.

To Reproduce

Steps to reproduce the behavior:

  1. Try to chat with bot
  2. Get error response

Expected behavior

Chatting with the bot succeeds

Screenshots

[Screenshots attached]

Desktop (please complete the following information):

N/A

Additional context

I'm using the default system message provided by twinny. I also tried blanking the template and retrying, but it made no difference.
Twinny 2.X was working, but I have no way to downgrade the version; at least, not from the UI.

See also: #191, since the cause may be related, just invisible to the user.

Hey, this was updated because Ollama now supports the OpenAI specification.

https://ollama.com/blog/openai-compatibility

Please update Ollama and try again.
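For context, recent Ollama versions expose an OpenAI-compatible chat endpoint at `/v1/chat/completions` (see the blog post above), which is why older Ollama builds fail here. Below is a minimal sketch of such a request, assuming the default `localhost:11434` port and the codellama:7b-instruct model from this report; it is not twinny's exact request, just an illustration of the API shape.

```python
# Sketch of an OpenAI-style chat request against Ollama's
# OpenAI-compatible endpoint (requires a recent Ollama build).
import json
import urllib.request

# Default local Ollama address; adjust if your server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_payload(prompt: str, model: str = "codellama:7b-instruct") -> dict:
    """Build an OpenAI-style chat payload for the given prompt and model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request a single JSON response, not a stream
    }


def chat(prompt: str) -> str:
    """Send the prompt to a running Ollama server and return the reply text."""
    payload = build_chat_payload(prompt)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses put the reply under choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

If `chat("hello")` returns an HTTP 404 or a similar error, the installed Ollama likely predates the OpenAI-compatibility release, which matches the symptom in this issue.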

Updating Ollama fixed it; my Ollama was outdated.