nilsherzig / LLocalSearch

LLocalSearch is a completely local search aggregator using LLM agents. The user asks a question and the system uses a chain of LLMs to find the answer. The user can watch the agents' progress and see the final answer. No OpenAI or Google API keys are needed.


Please add your ideas - infrastructure planning thread

nilsherzig opened this issue

Please comment if you have any thoughts on this:

We have a "chat layer" which keeps a history of the user's prompts and messages (green and purple). If needed, this chat chain can call other chains, which run autonomously, without user interaction, using a self-critique loop.
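To make the idea concrete, here is a minimal Go sketch of what such a hand-off could look like: the chat layer passes a task string to an autonomous chain, which answers, critiques its own answer, and retries up to a budget before returning a string. The `LLM` interface, `runAutonomousChain`, and the "OK"-based stopping condition are all made up for illustration and are not the current implementation.

```go
package main

import (
	"fmt"
	"strings"
)

// LLM stands in for whatever model client ends up being used
// (e.g. an Ollama call via langchaingo). Purely illustrative.
type LLM interface {
	Generate(prompt string) string
}

// runAutonomousChain sketches the self-critique loop: generate an answer,
// ask the model to critique it, and revise until the critique passes or
// the retry budget is exhausted. The result is just a string handed back
// to the chat layer.
func runAutonomousChain(llm LLM, task string, maxTries int) string {
	answer := llm.Generate(task)
	for i := 0; i < maxTries; i++ {
		critique := llm.Generate("Critique this answer.\nTask: " + task + "\nAnswer: " + answer)
		if strings.Contains(strings.ToUpper(critique), "OK") {
			break // critique passed, return the answer to the chat layer
		}
		answer = llm.Generate("Improve the answer using this critique.\nCritique: " + critique + "\nTask: " + task)
	}
	return answer
}

// fakeLLM lets the sketch run without a real model.
type fakeLLM struct{}

func (fakeLLM) Generate(prompt string) string { return "OK - placeholder answer" }

func main() {
	fmt.Println(runAutonomousChain(fakeLLM{}, "example task", 3))
}
```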

It would be very easy to add more chains to this, like a "programming chain" using deepseek-coder as the LLM.

I would have to come up with a config format for these chains, but essentially they are just a couple of conditions and strings in / strings out.
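As a starting point for discussion, a chain definition might boil down to a struct like the following: a model, a prompt template (the "string in"), a few routing conditions, and a critique budget before the "string out" is returned. All field names and the deepseek-coder example are assumptions for illustration, not a proposed final format.

```go
package main

import "fmt"

// ChainConfig is one guess at what a chain definition could look like.
// Every field name here is hypothetical.
type ChainConfig struct {
	Name        string   // e.g. "programming"
	Model       string   // e.g. "deepseek-coder"
	Triggers    []string // conditions that route a request to this chain
	Prompt      string   // template; the user question is the "string in"
	MaxCritique int      // self-critique rounds before returning the "string out"
}

func main() {
	programming := ChainConfig{
		Name:        "programming",
		Model:       "deepseek-coder",
		Triggers:    []string{"code", "bug", "compile error"},
		Prompt:      "You are a coding assistant. Task: {{.Question}}",
		MaxCritique: 3,
	}
	fmt.Printf("%+v\n", programming)
}
```

Whether this lives in Go structs, YAML, or something else is exactly the kind of feedback this thread is for.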

[attached diagram: infra.drawio - infrastructure overview]

Aren’t GitHub Discussions better suited for this?