rjmacarthy / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.

Home Page: https://rjmacarthy.github.io/twinny-docs/

Something wrong with the new update: no ability to add model name. What could the issue be?

spirobel opened this issue · comments

[Screenshot from 2024-04-11 03-13-03]

Hi, do I need to reset my settings?
Somehow the plugin stopped generating completions.

These are the models I used before:
(base) test@test:~$ ollama list
NAME                               ID              SIZE     MODIFIED
deepseek-coder:33b-instruct        acec7c0b0fd9    18 GB    5 weeks ago
deepseek-coder:6.7b-base-q5_K_M    5d67f76f7c57    4.8 GB   5 weeks ago
dolphin-mixtral:latest             cfada4ba31c7    26 GB    5 weeks ago
stable-code:3b-code-q4_0           aa5ab8afb862    1.6 GB   5 weeks ago

But now all the model settings are gone and there is no way to pick the model settings in the new UI. I tried uninstalling and reinstalling, but it still doesn't work... Maybe it is because the structure of the config changed and the old one is still saved? Any idea how to fix this?

Anyway, thanks for the plugin! It's a great tool. Keep up the good work.

Okay, never mind, after deleting and resetting the profiles (and restarting Ollama) it works again.

Thanks, it was probably a configuration issue: the recent minor release added some new settings for API connections, and I expected some breaking changes. I just pushed a fix for the 0 that was displaying because the models weren't loading.
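As a side note, the model list presumably comes from the local Ollama server, so an empty dropdown usually means Ollama isn't reachable or reports no models. A minimal sketch of how to verify that, assuming the default Ollama endpoint at http://localhost:11434 (the /api/tags route is Ollama's standard model-listing endpoint; the script itself is illustrative and not part of twinny):

```typescript
// check-ollama.ts - quick sanity check that Ollama is reachable and lists models.
// Assumes the default Ollama endpoint; adjust the URL if you changed host or port.
const OLLAMA_URL = "http://localhost:11434";

interface OllamaTag {
  name: string;        // e.g. "deepseek-coder:6.7b-base-q5_K_M"
  size: number;        // bytes
  modified_at: string;
}

async function listModels(): Promise<void> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with ${res.status} - is 'ollama serve' running?`);
  }
  const body = (await res.json()) as { models: OllamaTag[] };
  if (body.models.length === 0) {
    console.log("Ollama is up but has no models pulled - the plugin would show an empty list.");
    return;
  }
  for (const m of body.models) {
    console.log(`${m.name}\t${(m.size / 1e9).toFixed(1)} GB`);
  }
}

listModels().catch((err) => console.error(err));
```

Run it with a recent Node (18+, which ships global fetch); if it prints the same models as `ollama list`, the problem is on the extension/config side rather than the server.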

Thanks for your work! My 3090 goes brr again.
I had to reset the settings and then select the models again (chat needs to be on top and then FIM), but now it works.
[Screenshot of the working provider settings]
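For reference, the working setup described above amounts to two provider entries, one for chat and one for FIM, each pointing at a different local model. A rough sketch of that pairing (the interface and field names below are hypothetical and for illustration only, not twinny's actual provider schema; the real values are configured through the provider UI):

```typescript
// Illustrative only: the shape of a chat + FIM provider pair as described above.
// Field names are hypothetical; model names are taken from the `ollama list` output earlier.
interface ProviderEntry {
  label: string;
  type: "chat" | "fim";
  provider: "ollama";
  model: string;
  baseUrl: string;
}

const providers: ProviderEntry[] = [
  {
    label: "Ollama chat",
    type: "chat",
    provider: "ollama",
    model: "deepseek-coder:33b-instruct",
    baseUrl: "http://localhost:11434",
  },
  {
    label: "Ollama FIM",
    type: "fim",
    provider: "ollama",
    model: "deepseek-coder:6.7b-base-q5_K_M",
    baseUrl: "http://localhost:11434",
  },
];
```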

By the way, how do I select the active provider when I add more than just one FIM and chat provider?
[Screenshot of the configured providers]
I tried to copy one; then FIM was on top and the chat copy was second, but then chat became unresponsive, and maybe FIM too.
It said "ollama chat Copy" in the title of the second provider. (Then I pressed reset, selected my models again, and it worked.) (Not complaining, I am happy lol.)

Hey, I am not sure if this is still an issue :). To select the provider, click the little robot icon above the chat box. I see that in your FIM provider you use a deepseek model but the template is codellama; you should use automatic or deepseek as your FIM template too.
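To illustrate why the FIM template matters: each model family expects its own fill-in-the-middle tokens, and a base model given the wrong markers tends to echo them back or return nothing, which from the editor looks exactly like "no completions". A minimal sketch of the two prompt shapes (the token strings are the published FIM formats for CodeLlama and DeepSeek-Coder; the helper function is just for illustration and is not twinny's implementation):

```typescript
// Illustrative helper: build a fill-in-the-middle prompt for two model families.
// CodeLlama and DeepSeek-Coder use different special tokens, which is why the
// template selected in the plugin has to match the model being served.
type FimTemplate = "codellama" | "deepseek";

function buildFimPrompt(template: FimTemplate, prefix: string, suffix: string): string {
  switch (template) {
    case "codellama":
      // CodeLlama infill format: <PRE> prefix <SUF> suffix <MID>
      return `<PRE> ${prefix} <SUF>${suffix} <MID>`;
    case "deepseek":
      // DeepSeek-Coder infill format uses its own fullwidth-bar tokens.
      return `<｜fim▁begin｜>${prefix}<｜fim▁hole｜>${suffix}<｜fim▁end｜>`;
  }
}

// Example: asking the model to fill in a function body.
const prompt = buildFimPrompt(
  "deepseek",
  "function add(a: number, b: number) {\n  return ",
  ";\n}\n"
);
console.log(prompt);
```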