twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.

Home Page: https://twinny.dev

Twinny stops working after machine goes in standby

onel opened this issue · comments

Describe the bug
Not sure if this is a twinny or an ollama issue, but the chat feature seems to stop working after the machine goes into standby.
Using an M1 MacBook Air
macOS Ventura 13.4

To Reproduce
Steps to reproduce the behavior:

  1. Start ollama with `ollama run ...`
  2. VS Code and twinny are running normally
  3. Close the laptop lid
  4. Open it back up
  5. ollama is still responsive when used from the terminal
  6. Sending a message through twinny in VS Code shows the loading indicator indefinitely

Expected behavior
Twinny chat should continue working

Restarting VS Code and ollama doesn't seem to fix the problem, so I'm wondering if something else is involved.
Ollama works fine through the CLI.

Hmm, I have not experienced this issue.

I don't have a laptop/MacBook, so I'm unable to test what might be happening. Does the Ollama API respond separately?
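One way to check this is to probe the Ollama HTTP API directly, since twinny talks to Ollama over HTTP rather than through the CLI. A minimal sketch, assuming Ollama is on its default port 11434 (adjust the host/port if your setup differs):

```shell
# Probe the Ollama HTTP API directly, bypassing both twinny and the CLI.
# /api/tags lists local models and is a cheap liveness check.
# --max-time 5 ensures the check fails fast instead of hanging,
# which mirrors the indefinite loading indicator seen in twinny.
if curl -sf --max-time 5 http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama API is responding"
else
  echo "Ollama API is not responding"
fi
```

If the CLI works but this check hangs or fails after waking from standby, the problem is likely in Ollama's HTTP server rather than in twinny.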

Stale