bigscience-workshop / petals

🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading

Home Page: https://petals.dev

Support stable diffusion model

lbgws2 opened this issue · comments

lbgws2 commented:

Can I use a Stable Diffusion model with Petals?

Hi @lbgws2,

Stable Diffusion usually fits on one GPU, so we recommend using StableHorde instead (another volunteer computing project, focused on smaller models).

We don't plan to support Stable Diffusion, since Petals is focused on very large models that don't fit on one consumer GPU (which is why we have to use pipeline parallelism).
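As a back-of-the-envelope illustration of the distinction above (the parameter counts and precision here are illustrative assumptions, not exact figures for any specific checkpoint):

```python
def weights_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed for model weights alone at fp16/bf16 precision, in GiB."""
    return n_params * bytes_per_param / 2**30

# A 176B-parameter model (BLOOM-scale) needs hundreds of GiB for weights
# alone, far beyond any single consumer GPU, hence pipeline parallelism.
large = weights_gib(176e9)   # ≈ 328 GiB

# A ~1B-parameter diffusion model fits comfortably on one consumer GPU
# (e.g. 24 GiB), so a Petals-style swarm isn't needed for it.
small = weights_gib(1e9)     # < 2 GiB

print(f"176B model weights: {large:.0f} GiB")
print(f"1B model weights:   {small:.1f} GiB")
```

Activations, KV caches, and optimizer state only widen this gap, so the weight count alone is enough to see why the two workloads call for different infrastructure.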

lbgws2 commented:

What a surprise!
Thank you very much, @borzunov