Piyushi-0 / NeuralOptimalTransport

PyTorch implementation of "Neural Optimal Transport" (ICLR 2023)

Home Page: https://openreview.net/forum?id=d8CBRlWNkqH

Neural Optimal Transport (NOT)

This is the official Python implementation of the ICLR 2023 spotlight paper Neural Optimal Transport (paper on OpenReview) by Alexander Korotin, Daniil Selikhanovych and Evgeny Burnaev.

The repository contains reproducible PyTorch source code for computing optimal transport (OT) maps and plans for strong and weak transport costs in high dimensions with neural networks. Examples are provided for toy problems (1D, 2D) and for the unpaired image-to-image translation task for various pairs of datasets.

Presentations

Seminars and Materials

Related repositories

Citation

@inproceedings{korotin2023neural,
    title={Neural Optimal Transport},
    author={Korotin, Alexander and Selikhanovych, Daniil and Burnaev, Evgeny},
    booktitle={International Conference on Learning Representations},
    year={2023},
    url={https://openreview.net/forum?id=d8CBRlWNkqH}
}

Application to Unpaired Image-to-Image Translation Task

The unpaired domain translation task can be posed as an OT problem. Our NOT algorithm is applicable here. It searches for a transport map with the minimal transport cost (we use $\ell^{2}$), i.e., it naturally tends to preserve certain image attributes during the translation.
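The strong quadratic transport cost mentioned above can be estimated per batch as the mean squared $\ell^2$ distance between inputs and their mapped images. A minimal sketch (the function name and shapes are illustrative, not the repo's API):

```python
import torch

def strong_quadratic_cost(x: torch.Tensor, t_x: torch.Tensor) -> torch.Tensor:
    """Batch estimate of the strong quadratic cost 0.5 * E ||x - T(x)||^2.

    x, t_x: tensors of shape (batch, *dims); each sample is flattened
    before the squared L2 distance is computed.
    """
    diff = (x - t_x).flatten(start_dim=1)        # (batch, d)
    return 0.5 * (diff ** 2).sum(dim=1).mean()   # scalar

x = torch.randn(8, 3, 64, 64)
print(strong_quadratic_cost(x, x).item())  # identity map -> 0.0
```

During training, `t_x` would be the output of the transport network on `x`; minimizing this cost (against the adversarial potential) encourages the map to change inputs as little as the target distribution allows.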

Compared to the popular image-to-image translation models based on GANs or diffusion models, our method provides the following key advantages:

  • controllable amount of diversity in generated samples (without any duct tape or heuristics);
  • better interpretability of the learned map.

Qualitative examples are shown below for various pairs of datasets (at resolutions $128\times 128$ and $64\times 64$).

One-to-one translation, strong OT

We show unpaired translation with NOT with the strong quadratic cost on outdoor → church, celeba (female) → anime, shoes → handbags, handbags → shoes, male → female, anime → shoes, anime → celeba (female) dataset pairs.

One-to-many translation, weak OT

We show unpaired translation with NOT with the $\gamma$-weak quadratic cost on handbags → shoes, celeba (female) → anime, outdoor → church, anime → shoes, shoes → handbags, anime → celeba (female) dataset pairs.

Controlling the amount of diversity

Our method offers a single parameter $\gamma\in[0,+\infty)$ in the weak quadratic cost to control the amount of diversity.
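A per-batch Monte-Carlo estimate of the $\gamma$-weak quadratic cost can be sketched as follows: the expected squared distance between each input and its stochastic outputs $T(x,z)$, minus a $\gamma$-scaled variance term that rewards diversity. This is an illustrative sketch (biased variance estimate, assumed tensor layout), not the repo's exact implementation:

```python
import torch

def weak_quadratic_cost(x: torch.Tensor, t_xz: torch.Tensor, gamma: float) -> torch.Tensor:
    """Estimate of the gamma-weak quadratic cost for a batch.

    x:     (batch, *dims) inputs
    t_xz:  (batch, z_samples, *dims) stochastic outputs T(x, z)
    gamma: diversity weight; gamma = 0 recovers the strong quadratic cost.
    """
    xf = x.flatten(start_dim=1)                          # (b, d)
    tf = t_xz.flatten(start_dim=2)                       # (b, z, d)
    sq = ((xf.unsqueeze(1) - tf) ** 2).sum(dim=-1)       # (b, z)
    cost = 0.5 * sq.mean()
    # biased per-input variance of the mapped samples, averaged over the batch
    var = ((tf - tf.mean(dim=1, keepdim=True)) ** 2).sum(dim=-1).mean()
    return cost - 0.5 * gamma * var
```

Larger $\gamma$ makes spreading the outputs of a single input cheaper, so the learned stochastic map produces more diverse translations; $\gamma=0$ collapses it to a deterministic one-to-one map.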

Repository structure

The implementation is GPU-based with multi-GPU support. Tested with torch==1.9.0 on 1-4 Tesla V100 GPUs.

All the experiments are issued in the form of self-explanatory Jupyter notebooks (notebooks/). For convenience, most of the evaluation output is preserved. Auxiliary source code is moved to .py modules (src/).

  • notebooks/NOT_toy_1D.ipynb - toy experiments in 1D (weak costs);
  • notebooks/NOT_toy_2D.ipynb - toy experiments in 2D (weak costs);
  • notebooks/NOT_training_strong.ipynb - unpaired image-to-image translation (one-to-one, strong costs);
  • notebooks/NOT_training_weak.ipynb - unpaired image-to-image translation (one-to-many, weak costs);
  • notebooks/NOT_plots.ipynb - plotting the translation results (pre-trained models are needed);
  • stats/compute_stats.ipynb - pre-compute InceptionV3 statistics to speed up test FID computation.

Datasets

The dataloaders can be created by the load_dataset function from src/tools.py. The latter four datasets are loaded directly to RAM.
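For datasets that fit in memory, the "loaded directly to RAM" behaviour amounts to wrapping a pre-loaded image tensor in a standard DataLoader. A generic sketch of that pattern (this is not the repo's load_dataset; see src/tools.py for the actual signature and supported dataset names):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def make_ram_loader(images: torch.Tensor, batch_size: int = 64) -> DataLoader:
    """Wrap a tensor of images already resident in RAM in a DataLoader."""
    dataset = TensorDataset(images)
    return DataLoader(dataset, batch_size=batch_size, shuffle=True, drop_last=True)

images = torch.randn(256, 3, 64, 64)  # dummy stand-in for a real image tensor
loader = make_ram_loader(images)
batch, = next(iter(loader))
print(batch.shape)  # torch.Size([64, 3, 64, 64])
```

Keeping the whole dataset in RAM avoids per-batch disk I/O, which matters when the notebooks iterate over small image datasets many times.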

Credits

License: MIT