nilsherzig / LLocalSearch

LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.

limit db result size to not "overflow" context

nilsherzig opened this issue · comments

Is your feature request related to a problem? Please describe.
Sometimes we get more result text back than the model context can hold, especially when the context window is set to a low value. The LLM then fails to adhere to the requested answer structure, which leads to parsing errors.

Describe the solution you'd like
Limit the amount of context consumed by DB / web-search results before they are passed to the model.