lks-ai / anynode

A Node for ComfyUI that does what you ask it to do

Local .gguf model support

MilitantHitchhiker opened this issue

Wondering if it's possible to add direct support for local .gguf LLM models.

If you can load GGUF on Ollama or vLLM, then definitely. AnyNode doesn't load models itself; it calls the API endpoint of your local LLM host, such as Ollama, LM Studio, or vLLM. If the host supports GGUF, you can use a GGUF model in AnyNode by pointing AnyNode at that host's endpoint.
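
As a rough illustration (not AnyNode's actual code), the sketch below shows what "pointing at a local endpoint" amounts to in practice. It assumes Ollama is running on its default port (11434) with a GGUF model already imported (for example via a Modelfile) under the hypothetical name `my-gguf-model`; LM Studio works the same way on its own port. AnyNode makes the same style of OpenAI-compatible request, so if this call succeeds, pointing AnyNode at the same URL and model name should too.

```python
# Minimal sketch: query a local Ollama server that is serving a GGUF model
# through its OpenAI-compatible chat endpoint.
# Assumptions: Ollama on the default port, model imported as "my-gguf-model".
import requests

response = requests.post(
    "http://localhost:11434/v1/chat/completions",
    json={
        "model": "my-gguf-model",  # hypothetical name of the imported GGUF model
        "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The loading of the .gguf file itself stays on the host side: with Ollama, for instance, you import the file once (a Modelfile whose `FROM` line points at the .gguf, then `ollama create`), and from then on AnyNode only needs the endpoint URL and the model name.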