A series of samples demonstrating how to train simple models with the diffusers library.

Since the complicated conditioning machinery (e.g. CLIP text encoders) is removed, these scripts should be much easier for beginners to read, and can serve as a beginner's guide to diffusers.
- `vae.py` is a simple (unconditional) variational autoencoder
- `vqvae.py` is a simple (unconditional) vector-quantized VAE
- `pixel_diffusion.py` is an (unconditional) diffusion model that generates images directly from noise
- `latent_diffusion.py` is an (unconditional) latent diffusion model that generates latent vectors; it can work with either `vae.py` or `vqvae.py` to produce a final image