lks-ai / anynode

A Node for ComfyUI that does what you ask it to do

[Feature Request] Allow the loading of a local llm/T5 model within the node, instead of accessing another server.

GalaxyTimeMachine opened this issue:

I have several LLM and T5 models, already downloaded, that can be used within a workflow to enhance prompts. Would it be possible to select one of these local models from the dropdown options of the AnyNode node?

For example, using a loader like the T5 loader:
[screenshot of a T5 loader node]
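
For illustration only (this is not AnyNode's current API), loading one of those already-downloaded models in-process with Hugging Face transformers and calling it once to enhance a prompt could look roughly like this; the model path, prompt template, and enhance_prompt helper are hypothetical placeholders:

```python
# Minimal sketch, assuming a local model directory and the transformers library.
# Not AnyNode's actual implementation; names and paths are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_DIR = "/models/llm/my-local-llm"  # hypothetical path to an already-downloaded model

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, device_map="auto")

def enhance_prompt(prompt: str) -> str:
    """Run a single in-process generation pass over the user's prompt."""
    text = f"Rewrite this image prompt with more vivid detail:\n{prompt}\n"
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
    # Strip the echoed input so only the newly generated text is returned.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)

print(enhance_prompt("a castle on a hill"))
```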

If you can run it on a server, you can... I mean, I have no idea and have never used that, man. Seems out of scope.
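
For context, the "run it on a server" route would amount to sending a single request to a locally hosted OpenAI-compatible endpoint, such as the one Ollama exposes by default. The endpoint URL and model name below are assumptions for the sketch, not AnyNode settings:

```python
# Sketch of calling a local OpenAI-compatible server once per task.
# Assumes Ollama (or similar) is running locally; values are placeholders.
import json
import urllib.request

ENDPOINT = "http://localhost:11434/v1/chat/completions"  # Ollama's default local endpoint
payload = {
    "model": "llama3",  # hypothetical: any model already pulled locally
    "messages": [{"role": "user", "content": "Enhance this prompt: a castle on a hill"}],
}
req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
print(reply["choices"][0]["message"]["content"])
```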

It just runs within ComfyUI and is called once per task, rather than running as a server.
This seems like the most efficient way to use AnyNode.