Add Ollama as an LLM option
ptgoetz opened this issue
P. Taylor Goetz commented
What?
Add Ollama as an LLM option.
Why?
Ollama allows you to run an LLM locally with minimal effort. This can be especially important, for example, when demoing the project and derivative projects in a network-disconnected environment.
Implementation Considerations
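As one possible starting point, here is a minimal sketch of calling a locally running Ollama server through its REST API. It assumes Ollama is installed and serving on its default port (11434); the model name `llama3` is illustrative, and the function names are hypothetical, not part of this project.

```python
import json
import urllib.request

# Default endpoint for a locally running Ollama server (assumption:
# Ollama is installed and listening on its standard port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama REST API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def generate(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled,
    # e.g. `ollama pull llama3`.
    print(generate("llama3", "Say hello in one word."))
```

Because Ollama exposes an OpenAI-compatible API as well, another option would be to reuse an existing OpenAI client integration and simply point its base URL at the local server.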
P. Taylor Goetz commented
Initial PR: #276