getzep / zep

Zep: Long-Term Memory for AI Assistants.

Home Page: https://docs.getzep.com


No Google Gemini support yet?

HakaishinShwet opened this issue · comments

This is mentioned on the website: "Zep uses OpenAI for chat history summarization, intent analysis, and, by default, embeddings. You can get an OpenAI Key here." Does it not work with the latest Google Gemini?

The docs also say: "Select the LLM you'll be using for your Assistant. We count tokens in Zep artifacts like Messages and Summaries to help you stay within your prompt token budget. Knowing your model choice ensures we use the correct tokenizer.

Available options are GPT 3.5 or 4 family and Llama2 and related."

So if we want to use Zep with Google Gemini, there is no option? Google Gemini was released in December 2023, and in the past week they launched a better version of it, so is there still no support for it, or any plans?

Zep Open Source requires an LLM provider. If you're using the open source server, please take a look at the documentation here: https://docs.getzep.com/deployment/llm_config/
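For the open source server, the linked page covers pointing Zep at an LLM provider through its config file. A minimal sketch of what that might look like for an OpenAI-compatible endpoint (the field names `llm.service`, `llm.model`, and `llm.openai_endpoint` are assumptions from memory, not confirmed here; verify them against the linked config reference):

```yaml
# Hypothetical excerpt from a Zep OSS config.yaml -- check the field
# names against https://docs.getzep.com/deployment/llm_config/
llm:
  service: "openai"              # Zep talks the OpenAI API format
  model: "gpt-3.5-turbo"         # model name your endpoint advertises
  openai_endpoint: "http://localhost:4000/v1"  # any OpenAI-compatible server
```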

Zep Cloud does not require an LLM provider for the services it provides. The following refers to selecting the LLM you are using for your app. We don't yet support token counting for Gemini family models.

Select the LLM you'll be using for your Assistant. We count tokens in Zep artifacts like Messages and Summaries to help you stay within your prompt token budget. Knowing your model choice ensures we use the correct tokenizer.

Available options are GPT 3.5 or 4 family and Llama2 and related.

@danielchalef instead of closing this issue, can you reopen it and track Gemini family support as a future task for an upcoming update? Many people use Gemini already and demand will only grow, so rather than sticking to what has already been implemented, it would be much better for this project to support Gemini too. People are choosing it for several reasons: 1) its API usage is free (within rate limits, which work fine for most), and 2) it beats GPT in several ways and has different pricing, so people will want to try it out. I believe it should already be supported by now, but in any case you could add this as a future task and work on it so that we Gemini users can also use Zep to its full extent.

I was thinking about LiteLLM (https://github.com/BerriAI/litellm). It is an awesome project for creating an LLM API proxy server that uses the OpenAI format, and it supports Gemini too. So if we create an API proxy server for Gemini, run it locally, and provide its URL in the Zep config, will that work? Can you test this out? If it works, it will be awesome. @danielchalef
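To make the proxy idea concrete, here is a rough sketch of a LiteLLM proxy config that exposes Gemini behind an OpenAI-format endpoint. This is an assumption about LiteLLM's `config.yaml` format and its default port, not a tested setup; check the LiteLLM docs before relying on it:

```yaml
# Hypothetical LiteLLM proxy config.yaml -- verify the model_list
# schema against the LiteLLM documentation.
model_list:
  - model_name: gpt-3.5-turbo        # the name Zep would request
    litellm_params:
      model: gemini/gemini-pro       # the actual Gemini backend model
      api_key: os.environ/GEMINI_API_KEY  # read key from the environment
```

You would then start the proxy (e.g. with `litellm --config config.yaml`) and point Zep's OpenAI endpoint setting at the proxy's local URL, assuming LiteLLM's default of `http://localhost:4000`.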