[Feature Request] Allow loading a local LLM/T5 model within the node, instead of accessing another server.
GalaxyTimeMachine opened this issue · comments
GalaxyTimeMachine commented
LK Studio commented
If you can run it on a server, you can... I mean, I have no idea and have never used that, man. Seems out of scope.
GalaxyTimeMachine commented
It just runs within ComfyUI and is called once per task; it isn't something that runs as a server.
This seems like the most efficient way to use AnyNode.