Local .gguf model support
MilitantHitchhiker opened this issue · comments
MilitantHitchhiker commented
Wondering if it's possible to add direct support for local .gguf LLM models.
LK Studio commented
If you can load a GGUF model in Ollama or vLLM, then definitely. AnyNode doesn't load models itself; it calls the API endpoint of your local LLM host, such as Ollama, LM Studio, or vLLM. If your host supports GGUF, you can use a GGUF model in AnyNode by pointing it at the right endpoint.
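As a rough sketch of what "pointing it correctly" means: hosts like Ollama and LM Studio expose an OpenAI-compatible HTTP API, and a client only needs the endpoint URL plus a model name. The endpoint below assumes Ollama's default port (11434); the model name is a hypothetical placeholder for whatever GGUF-backed model you have loaded.

```python
import json
import urllib.request

# Assumption: Ollama is running locally with its OpenAI-compatible API.
# LM Studio's default would be http://localhost:1234/v1/chat/completions instead.
ENDPOINT = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "my-gguf-model",  # hypothetical name of your locally loaded GGUF model
    "messages": [{"role": "user", "content": "Hello"}],
}

def build_request(endpoint: str, body: dict) -> urllib.request.Request:
    """Build the POST request a client like AnyNode would send to the host."""
    return urllib.request.Request(
        endpoint,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(ENDPOINT, payload)
print(req.full_url)  # the client only ever sees this URL, never the .gguf file
# urllib.request.urlopen(req)  # uncomment once your local server is running
```

The point of the sketch: the GGUF file itself is loaded by the host process, and the client talks to it purely over HTTP, so any host that serves GGUF works unchanged.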