getzep / zep

Zep: Long-Term Memory for AI Assistants.

Home Page: https://docs.getzep.com


[FEAT] Support Ollama

airtonix opened this issue

Is your feature request related to a problem? Please describe.

Would like to be able to use Ollama instead of OpenAI.

We're unlikely to support Ollama. You can, however, use local LLMs with any inference server that provides an OpenAI-compatible API, such as LocalAI.

More here: https://docs.getzep.com/deployment/llm_config/
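As a rough sketch of the approach described above, running a LocalAI container and pointing Zep's OpenAI settings at it might look like the following. The environment variable names and image tag here are illustrative assumptions, not confirmed configuration keys; the linked LLM configuration docs are the authoritative reference.

```shell
# Start LocalAI, which exposes an OpenAI-compatible API
# (image name is an assumption; check the LocalAI docs for the current one)
docker run -d -p 8080:8080 localai/localai

# Point Zep at the local endpoint instead of api.openai.com.
# Variable names below are hypothetical placeholders; the exact keys
# are documented at the LLM configuration page linked above.
export ZEP_OPENAI_API_ENDPOINT="http://localhost:8080/v1"
export ZEP_OPENAI_API_KEY="not-needed-for-local"
```

The key point is that Zep only requires an endpoint that speaks the OpenAI API shape, so any compatible server (LocalAI, or another proxy in front of Ollama) can stand in for OpenAI itself.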