rashadphz / farfalle

🔍 AI search engine - self-host with local or cloud LLMs

Home Page: https://www.farfalle.dev/


Can you add custom search functionality and the ability to choose different Ollama models?

Soein opened this issue

Can you add the option to choose multiple models for Ollama, such as the 70B model, and also add the ability to perform custom searches, like integrating Bing search?

Other models

Just added support for all Ollama models through https://www.litellm.ai/!

In your `.env`, set `CUSTOM_MODEL=` to any provider/model from this list: https://litellm.vercel.app/docs/providers.
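For example, to point farfalle at a local Ollama model (the model tag below is illustrative; substitute any model you have pulled, such as a 70B variant):

```
# LiteLLM routes "ollama/..." model names to a local Ollama server.
# The tag is an example -- run `ollama list` to see what you have installed.
CUSTOM_MODEL=ollama/llama3:70b
```

The `ollama/` prefix is LiteLLM's provider-routing convention; the same `CUSTOM_MODEL` variable should accept any other provider/model pair from the LiteLLM providers list.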

Bing

I also added Bing Search support. You can set it up like so in your `.env`:

BING_API_KEY=...
SEARCH_PROVIDER=bing