agentcoinorg / evo.ninja

A versatile generalist agent.

Make it possible to use local LLM (llama.cpp, oobabooga) APIs to run evo.ninja

Tempaccnt opened this issue

Is your feature request related to a problem? Please describe.
Not everyone has access to the OpenAI API, and even if you do, you're always limited by your budget. Having an agent that can run completely free and offline would be great.

Describe the solution you'd like
Try to include llama.cpp or another local LLM project so the user can use custom models placed inside a folder.
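For context, llama.cpp's bundled server already exposes OpenAI-compatible routes such as `/v1/chat/completions`, so the agent's existing request logic could mostly be reused by swapping the base URL. A minimal sketch (the localhost port and model name are assumptions, not evo.ninja's actual config):

```python
import json

# Assumption: a local llama.cpp (or oobabooga) server running with
# OpenAI-compatible routes on its default port.
LOCAL_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(messages, model="local-model", temperature=0.0):
    """Build the URL and JSON body for an OpenAI-style chat completion
    aimed at a local server instead of api.openai.com."""
    url = f"{LOCAL_BASE_URL}/chat/completions"
    payload = {
        "model": model,          # many local servers ignore or map this field
        "messages": messages,
        "temperature": temperature,
    }
    return url, json.dumps(payload)

url, body = build_chat_request([{"role": "user", "content": "Hello!"}])
```

Because the wire format is the same, no model files need to live inside the agent itself; the user just points the base URL at whatever server hosts their custom models.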

What you're saying seems feasible. Is there an open-source project available that handles requests using the OpenAI API routes? The only concern is whether this project utilizes function calling. If it does, then we might need to include an additional parameter for function calling in the prompt, like here.
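For models without native function calling, a common workaround is exactly that: embed the function schemas in the prompt and ask the model to reply with JSON, then parse it back out. A hedged sketch of what such an extra prompt parameter could look like (all names here are hypothetical):

```python
import json

def build_function_prompt(functions):
    """Embed OpenAI-style function schemas into a system prompt, for
    backends that lack native function calling (assumption: the model
    follows the JSON-only instruction)."""
    schema_text = json.dumps(functions, indent=2)
    return (
        "You can call the following functions. To call one, reply ONLY with "
        'JSON of the form {"function": <name>, "arguments": {...}}.\n'
        f"Available functions:\n{schema_text}"
    )

def parse_function_call(model_output):
    """Extract a function call from the model's raw text reply, if present."""
    try:
        data = json.loads(model_output)
        return data["function"], data.get("arguments", {})
    except (json.JSONDecodeError, KeyError, TypeError):
        return None, None

prompt = build_function_prompt([{"name": "readFile", "parameters": {}}])
name, args = parse_function_call(
    '{"function": "readFile", "arguments": {"path": "a.txt"}}'
)
```

The parsing step is the fragile part in practice; local models often wrap the JSON in extra prose, so real implementations usually add retry or extraction logic on top.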

I am very eager to integrate it with a PR.

This thread on Polywrap's discord covers previous research around this feature

https://discord.com/channels/796821176743362611/1211733650375184466/1211753664218275840

To join the Discord and read the messages: https://discord.gg/k7UCsH3ps9

Let me know if there is any way I can provide guidance, either through a comment here or through the Discord!

@haliliceylan @Tempaccnt Happy Monday!

@rihp I could not pass the captcha check on Discord, so I am writing here again.

LiteLLM actually supports function calling, and it even supports non-OpenAI models; see:

https://docs.litellm.ai/docs/completion/function_call#using-function_to_dict-with-function-calling
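The schemas LiteLLM expects are the standard OpenAI function-calling dicts (its `function_to_dict` helper can generate them from a docstring). A minimal sketch, with the schema written out by hand so it is self-contained; the model name and the exact `litellm.completion` call are assumptions based on the docs above:

```python
def get_current_weather(location: str) -> str:
    """Return the current weather for a location (hypothetical tool)."""
    return f"Sunny in {location}"

# OpenAI-style function schema; litellm.utils.function_to_dict can produce
# an equivalent dict from the docstring, but it is inlined here.
weather_schema = {
    "name": "get_current_weather",
    "description": "Return the current weather for a location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City name"},
        },
        "required": ["location"],
    },
}

# With LiteLLM installed and a local backend running, the call would look
# roughly like this (assumption based on the linked docs, not verified here):
# import litellm
# response = litellm.completion(
#     model="ollama/llama2",
#     messages=[{"role": "user", "content": "Weather in Berlin?"}],
#     functions=[weather_schema],
# )
```

Since LiteLLM translates this into whatever the backend understands, evo.ninja's existing function-calling flow would not need to change per provider.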

Hi everyone, I found this discussion by coincidence. I like this project, but for my use case it is too strongly coupled to the OpenAI service, which is why I started writing additional adapters for other LLM services. Unfortunately, the code doesn't work yet, so I can't show anything. But I wanted to briefly comment and show that there is interest in such a feature here as well.
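An adapter layer like that usually boils down to a thin interface that both the OpenAI client and any local backend implement, so the agent core never talks to a provider directly. A hypothetical sketch (all names are assumptions, not evo.ninja's actual API):

```python
from abc import ABC, abstractmethod

class LlmAdapter(ABC):
    """Hypothetical interface the agent would depend on instead of OpenAI."""

    @abstractmethod
    def complete(self, messages: list[dict]) -> str:
        ...

class EchoAdapter(LlmAdapter):
    """Stand-in for a local backend (llama.cpp, oobabooga, etc.);
    a real adapter would POST to the backend's API here."""

    def complete(self, messages: list[dict]) -> str:
        return f"echo: {messages[-1]['content']}"

def run_agent_step(adapter: LlmAdapter, user_input: str) -> str:
    # The agent only sees the adapter, so swapping backends is a config change.
    return adapter.complete([{"role": "user", "content": user_input}])

result = run_agent_step(EchoAdapter(), "hello")
```

The design choice here is that backend selection becomes dependency injection: each new service is one new adapter class rather than changes scattered through the agent logic.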

I found a project that seems to implement this (focused on coding) using Ollama. I haven't taken a deep look into it yet, but I have seen some videos of it running.

Here it is:
https://github.com/stitionai/devika

I would be very interested in such a feature too!