Feature Request - Option to power agents from a local LLM server
TSM-EVO opened this issue
It would be very handy to have the ability to connect to a local LLM server such as Ollama, for a truly local solution capable of generating these agents.
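For context, Ollama exposes an OpenAI-compatible endpoint, so in principle existing OpenAI-client code could be redirected with just a base-URL override rather than a new integration. A minimal Python sketch of that approach (the model name, port, and prompt here are assumptions for illustration, not this project's actual configuration):

```python
from openai import OpenAI

# Ollama serves an OpenAI-compatible API at /v1 on its default port.
# The api_key must be non-empty for the client but is ignored by Ollama.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local Ollama endpoint
    api_key="ollama",
)

response = client.chat.completions.create(
    model="llama3",  # hypothetical: any model pulled via `ollama pull`
    messages=[{"role": "user", "content": "Draft a one-line agent persona."}],
)
print(response.choices[0].message.content)
```

The appeal of this pattern is that the backend becomes a configuration setting (a base URL and model name) instead of a separate code path.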
I'll consider it, but a local model would likely be 7 to 10 times slower.
Thanks... -jjg