rjmacarthy / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.

Home Page: https://rjmacarthy.github.io/twinny-docs/


Default ollama `Chat Api Path` points to the wrong URL path

brunoais opened this issue · comments

Describe the bug
When the Api Provider is set to ollama, the default Chat Api Path is set to /v1/chat/completions instead of the correct /api/chat (source)

Note: This looks like a copy/paste or selection error, because /v1/chat/completions is also the default for lmstudio

To Reproduce
Steps to reproduce the behavior:

  1. Go to the twinny setting twinny.apiProvider
  2. Click on it and change it to ollama (if it is already ollama, change to another provider and back)
  3. Problem: the setting twinny.chatApiPath becomes /v1/chat/completions; it should be /api/chat.
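As a manual workaround until the default is sorted out, the path can be pinned explicitly in settings.json (a sketch using the two setting keys named above; the exact value to use depends on your Ollama version):

```json
{
  "twinny.apiProvider": "ollama",
  "twinny.chatApiPath": "/api/chat"
}
```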

Screenshots
(two screenshots attached to the original issue showing the default settings values)

Hey, this was updated because Ollama now supports the OpenAI specification.

https://ollama.com/blog/openai-compatibility
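In other words, recent Ollama builds (0.1.24 and later, per the blog post above) serve an OpenAI-style /v1/chat/completions endpoint alongside the native /api/chat endpoint, so twinny's new default only works against an up-to-date Ollama. A minimal sketch of the difference (the host is Ollama's default local address; the model name and the `build_chat_request` helper are hypothetical illustrations):

```python
import json

# Default address of a local Ollama install (assumption: standard setup).
OLLAMA_HOST = "http://localhost:11434"

def build_chat_request(model, messages, openai_compatible=True):
    """Build the endpoint URL and JSON body for a chat request to Ollama.

    Newer Ollama serves the OpenAI-style /v1/chat/completions path in
    addition to its native /api/chat path; older builds only have the
    latter, which is what this issue tripped over.
    """
    path = "/v1/chat/completions" if openai_compatible else "/api/chat"
    body = {"model": model, "messages": messages, "stream": False}
    return OLLAMA_HOST + path, json.dumps(body)

url, body = build_chat_request("codellama", [{"role": "user", "content": "hi"}])
print(url)  # → http://localhost:11434/v1/chat/completions
```

Both endpoints accept the same kind of chat payload; only the URL path and response shape differ, which is why updating Ollama (rather than changing the path) also resolves the error.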

Please update Ollama and try again.

Updating Ollama fixed it; my Ollama install was outdated.