vanna-ai / vanna

🤖 Chat with your SQL database 📊. Accurate Text-to-SQL Generation via LLMs using RAG 🔄.

Home Page: https://vanna.ai/docs/

How do I access a local LLM without Ollama?

mobguang opened this issue · comments

Describe the bug
I cannot find documentation explaining how to access a local LLM without Ollama.

To Reproduce
I have downloaded CodeLlama-7b-Instruct-hf on my Mac, but the Vanna.ai documentation has no example describing how to use this kind of LLM.

Expected behavior
Is there any way to access this kind of LLM? Thanks in advance.
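For reference, Vanna's documented extension point for unsupported LLMs is to implement the message-building and prompt-submission methods yourself (in real use, by subclassing `vanna.base.VannaBase` and combining it with a vector store such as `vanna.chromadb.ChromaDB_VectorStore`). The sketch below is a minimal, self-contained illustration of that pattern: the class name `MyLocalLLM` and the `generate_fn` stand-in for a Hugging Face `transformers` pipeline are hypothetical, and a real implementation would use the model's own chat template rather than the naive flattening shown here.

```python
# Sketch of Vanna's custom-LLM pattern (assumed API, Vanna 0.5.x).
# In real use this class would subclass vanna.base.VannaBase, e.g.:
#
#   from vanna.base import VannaBase
#   from vanna.chromadb import ChromaDB_VectorStore
#   class MyVanna(ChromaDB_VectorStore, MyLocalLLM): ...
#
# The stub below keeps only the methods the pattern requires and
# replaces the Hugging Face model call with a placeholder callable.

class MyLocalLLM:
    def __init__(self, generate_fn):
        # generate_fn stands in for a local model, e.g. a transformers
        # pipeline("text-generation", model="codellama/CodeLlama-7b-Instruct-hf")
        self.generate_fn = generate_fn

    # Vanna builds prompts as lists of role/content dicts:
    def system_message(self, message: str) -> dict:
        return {"role": "system", "content": message}

    def user_message(self, message: str) -> dict:
        return {"role": "user", "content": message}

    def assistant_message(self, message: str) -> dict:
        return {"role": "assistant", "content": message}

    def submit_prompt(self, prompt, **kwargs) -> str:
        # Naively flatten the chat messages into one string; a real
        # implementation would apply the model's chat template instead.
        text = "\n".join(f"{m['role']}: {m['content']}" for m in prompt)
        return self.generate_fn(text)

# Demo with a fake "model" that returns a canned SQL answer:
llm = MyLocalLLM(lambda text: "SELECT 1;")
prompt = [
    llm.system_message("You generate SQL."),
    llm.user_message("How many rows are in t?"),
]
print(llm.submit_prompt(prompt))  # -> SELECT 1;
```

Swapping `generate_fn` for a real `transformers` pipeline wrapping the downloaded CodeLlama weights should let the rest of Vanna's Text-to-SQL flow work unchanged.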


Desktop (please complete the following information):

  • OS: macOS
  • Version: 14.4.1
  • Python: 3.11
  • Vanna: 0.5.4
