Integrate Ollama as an AI provider
brainless opened this issue
Sumit Datta commented
As a user of Dwata, I would like to use locally running LLMs through Ollama. Ollama can be installed from its released binaries, installers, or Docker images, and it exposes an HTTP API that Dwata could call.
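As a rough sketch of what the integration could look like, the snippet below sends a completion request to Ollama's `/api/generate` endpoint on its default port (11434). The model name `llama3` and the helper names are assumptions for illustration, not part of Dwata or the issue itself.

```python
import json
import urllib.request

# Ollama listens on this address by default after install.
OLLAMA_URL = "http://localhost:11434"


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks Ollama to return one JSON object instead of
    a stream of partial responses.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a completion request to a locally running Ollama instance."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming reply carries the full text in "response".
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires Ollama running locally with the model pulled,
    # e.g. `ollama pull llama3`.
    print(generate("llama3", "Say hello in one word."))
```

In Dwata itself the same request would presumably be issued from the app's own HTTP client, with the base URL and model name made configurable so users can point at any Ollama install.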