Run on networked nodes
ExtraE113 opened this issue · comments
Thanks for open-sourcing this! Because the GPU RAM requirements are so high, it's hard to rent a single node with enough memory from any of the major cloud providers. Is there a way to run inference distributed across multiple physical machines over a network?
Thanks!