sobieskibj / kbdm

Knowledge Base - Diffusion Models

Introductory materials

Papers

The WR (worth reading) scale from 1 to 3 indicates how important a paper is to read (higher means more important).

Foundations

2022

  • Classifier-Free Diffusion Guidance WR=3
    Simple but very practical idea. Instead of training an additional classifier for guidance, we can simply condition diffusion models on some signal and train them simultaneously with and without it. At sampling time, the conditional and unconditional predictions are combined to steer generation toward the conditioning signal (see the sketch below).
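
    A minimal sketch of the guided prediction, assuming an epsilon-predicting network with a `model(x, t, cond)` interface and a `cond=None` null-signal convention (these are my assumptions, not the paper's code):

    ```python
    def cfg_eps(model, x_t, t, cond, guidance_scale=3.0):
        """Classifier-free guidance at sampling time.

        The same jointly trained network gives an unconditional prediction
        (conditioning dropped, as during a fraction of training steps) and a
        conditional one; the two are linearly combined.
        """
        eps_uncond = model(x_t, t, cond=None)  # signal dropped
        eps_cond = model(x_t, t, cond=cond)    # signal provided
        # Extrapolate away from the unconditional prediction toward the conditional one.
        return eps_uncond + guidance_scale * (eps_cond - eps_uncond)
    ```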

2021

2020

  • Denoising Diffusion Implicit Models WR=3
    One of the most important papers: it shows that a trained diffusion model actually approximates an entire family of objectives, together with a deterministic process (referred to as DDIM) that enables faster inference and a direct mapping from image to noise and back (a minimal sketch of the update appears after this list). DDIM is used in almost every paper today. The main author is Jiaming Song, so it seems like having Song somewhere in your name makes you good at diffusion models.

  • Denoising Diffusion Probabilistic Models WR=3
    This paper revived diffusion models a few years after their introduction and made them go mainstream. It shows that diffusion models work great at practical resolutions like $256 \times 256$.
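
    For the DDIM entry above, a minimal sketch of the deterministic update step, assuming an epsilon-predicting model and a cumulative noise schedule (the variable names are mine, not the paper's):

    ```python
    def ddim_step(x_t, eps, alpha_bar_t, alpha_bar_prev):
        """One deterministic DDIM update (the eta = 0 case).

        `eps` is the model's noise prediction at the current timestep and the
        `alpha_bar_*` values come from the cumulative noise schedule.
        """
        # Clean-image estimate implied by the current noise prediction.
        x0_pred = (x_t - (1.0 - alpha_bar_t) ** 0.5 * eps) / alpha_bar_t ** 0.5
        # Deterministic move to the previous noise level: no fresh noise is injected,
        # which is what makes the image <-> noise mapping invertible.
        return alpha_bar_prev ** 0.5 * x0_pred + (1.0 - alpha_bar_prev) ** 0.5 * eps
    ```

    Iterating this step from high to low noise gives fast sampling; running it in the opposite direction gives the deterministic image-to-noise inversion mentioned above.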

2019

2015

Representation learning

2024

2023

2022

Inverse problems

2024

2023

2022

Consistency Models

2024

  • Easy Consistency Tuning WR=3
    Great blog post that will probably be converted into a paper. It begins with an intuitive introduction to Consistency Models and proceeds to show how the original framework can be improved by replacing distillation with fine-tuning of pretrained diffusion models (the underlying self-consistency objective is sketched after this list).

  • Consistency Trajectory Models: Learning Probability Flow ODE Trajectory of Diffusion WR=3
    A general framework that encompasses diffusion distillation techniques and consistency models, allowing jumps between arbitrary timesteps of the probability flow (PF) ODE.
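
    Both entries above revolve around the self-consistency property: every point on the same PF ODE trajectory should map to the same clean sample. A minimal sketch of a corresponding training loss, assuming an EDM-style parameterization $x_t = x_0 + t \cdot \epsilon$ and a squared-error distance (function names and the exact recipe are illustrative, not taken from either work):

    ```python
    import torch

    def consistency_loss(f_theta, x0, noise, t_small, t_big):
        """Self-consistency objective in the spirit of consistency training.

        Two points on the same trajectory (same clean image, same noise, two
        noise levels) should map to the same output of the consistency model
        f_theta(x, t).
        """
        x_small = x0 + t_small * noise
        x_big = x0 + t_big * noise
        # The target branch is detached from the graph (the EMA / stop-gradient
        # teacher in the original consistency models paper).
        with torch.no_grad():
            target = f_theta(x_small, t_small)
        pred = f_theta(x_big, t_big)
        return torch.mean((pred - target) ** 2)
    ```

    Easy Consistency Tuning initializes `f_theta` from a pretrained diffusion model instead of distilling into a fresh network, while CTM generalizes the mapping so it can jump to intermediate timesteps rather than only to the clean endpoint.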

2023
