Twinny stops working after machine goes into standby
onel opened this issue
Describe the bug
Not sure if this is a twinny or an ollama issue, but the chat feature seems to stop working after the machine goes into standby.
Using an M1 MacBook Air running macOS Ventura 13.4
To Reproduce
Steps to reproduce the behavior:
- Start ollama with `ollama run ...`
- VS code + twinny running normally
- Close the laptop lid
- Open back up
- ollama is still responsive when used from the terminal
- Sending a message through twinny in VS code shows the loading indicator indefinitely
Expected behavior
Twinny chat should continue working
Restarting VS Code and ollama doesn't seem to fix the problem, so I'm wondering if something else is involved.
Ollama works fine through the CLI.
Hmm, I have not experienced this issue.
I don't have a laptop/MacBook, so I'm unable to test what might be happening. Does the Ollama API respond separately?
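One way to check whether the Ollama HTTP API itself is reachable after waking the machine is to hit it directly with curl. This is a minimal diagnostic sketch: it assumes the default Ollama port 11434 (adjust if you've set `OLLAMA_HOST`); a healthy server replies with "Ollama is running" at its root endpoint.

```shell
# Probe the Ollama HTTP API directly, bypassing twinny/VS Code.
# Default port 11434 is an assumption; change it if OLLAMA_HOST is customized.
out=$(curl -sS --max-time 5 http://localhost:11434 || echo "Ollama API unreachable")
echo "$out"
```

If this prints "Ollama is running" while twinny's chat still hangs, the problem is likely in the extension's connection handling rather than in ollama itself.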
Stale