TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up

Code for "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up" (NeurIPS 2021), by Yifan Jiang, Shiyu Chang, and Zhangyang Wang.

Implementation

  • Gradient checkpointing using torch.utils.checkpoint (see the training-step sketch after this list)
  • 16-bit Precision Training (also shown in the sketch)
  • Distributed Training (Faster!)
  • IS/FID Evaluation
  • Gradient Accumulation (also shown in the sketch)
  • Stronger Data Augmentation
  • Self-Modulation (see the second sketch after this list)
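
Gradient checkpointing, 16-bit precision, and gradient accumulation compose naturally in a single training step. Below is a minimal sketch of that combination, not the repo's actual training loop: the toy Generator, its sizes, and the placeholder loss are illustrative assumptions.

```python
import torch
from torch.utils.checkpoint import checkpoint

class Block(torch.nn.Module):
    """Toy transformer block standing in for a TransGAN generator block."""
    def __init__(self, dim):
        super().__init__()
        self.attn = torch.nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.norm = torch.nn.LayerNorm(dim)

    def forward(self, x):
        h, _ = self.attn(x, x, x)
        return self.norm(x + h)

class Generator(torch.nn.Module):
    def __init__(self, dim=256, depth=5, use_checkpoint=True):
        super().__init__()
        self.blocks = torch.nn.ModuleList(Block(dim) for _ in range(depth))
        self.use_checkpoint = use_checkpoint

    def forward(self, x):
        for blk in self.blocks:
            if self.use_checkpoint and self.training:
                # Drop activations here and recompute them in backward,
                # trading compute for memory.
                x = checkpoint(blk, x, use_reentrant=False)
            else:
                x = blk(x)
        return x

device = "cuda" if torch.cuda.is_available() else "cpu"
gen = Generator().to(device)
opt = torch.optim.Adam(gen.parameters(), lr=1e-4, betas=(0.0, 0.99))
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))
accum_steps = 4  # effective batch size = 4x the micro-batch size

for step in range(8):
    x = torch.randn(8, 64, 256, device=device)  # dummy token sequence
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):  # 16-bit forward
        loss = gen(x).square().mean() / accum_steps  # placeholder loss
    scaler.scale(loss).backward()  # gradients accumulate across micro-steps
    if (step + 1) % accum_steps == 0:
        scaler.step(opt)
        scaler.update()
        opt.zero_grad(set_to_none=True)
```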

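Self-Modulation (Chen et al., "On Self Modulation for Generative Adversarial Networks", ICLR 2019) conditions the generator's normalization layers on the latent code z instead of using fixed affine parameters. A minimal sketch with illustrative names and sizes, not the repo's exact module:

```python
import torch
import torch.nn as nn

class SelfModulatedLayerNorm(nn.Module):
    """LayerNorm whose gain and shift are predicted from the latent code z."""
    def __init__(self, dim, z_dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim, elementwise_affine=False)
        self.to_gamma = nn.Linear(z_dim, dim)  # per-channel gain from z
        self.to_beta = nn.Linear(z_dim, dim)   # per-channel shift from z

    def forward(self, x, z):
        # x: (batch, tokens, dim), z: (batch, z_dim)
        gamma = self.to_gamma(z).unsqueeze(1)  # broadcast over tokens
        beta = self.to_beta(z).unsqueeze(1)
        return (1 + gamma) * self.norm(x) + beta

x = torch.randn(2, 64, 256)
z = torch.randn(2, 128)
print(SelfModulatedLayerNorm(256, 128)(x, z).shape)  # torch.Size([2, 64, 256])
```
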
Guidance

Cifar training script

python exp/cifar_train.py
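
The repo advertises distributed training, but the exact launch flags live in the script itself; check its argument parser. Purely for orientation, here is a generic DistributedDataParallel sketch (the torchrun invocation, model, and loss are illustrative assumptions, not the repo's entry point):

```python
# Launch with, e.g.: torchrun --nproc_per_node=2 ddp_sketch.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK / WORLD_SIZE / LOCAL_RANK for init_process_group.
    dist.init_process_group(backend="nccl" if torch.cuda.is_available() else "gloo")
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    if torch.cuda.is_available():
        torch.cuda.set_device(local_rank)
        device = torch.device("cuda", local_rank)
    else:
        device = torch.device("cpu")
    model = DDP(torch.nn.Linear(256, 10).to(device),
                device_ids=[local_rank] if torch.cuda.is_available() else None)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(10):
        x = torch.randn(32, 256, device=device)
        loss = model(x).square().mean()  # placeholder loss
        opt.zero_grad()
        loss.backward()  # DDP all-reduces gradients across ranks here
        opt.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```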

Cifar test

First download the CIFAR-10 checkpoint and put it in ./cifar_checkpoint. Then run the following script.

python exp/cifar_test.py
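
The repo ships its own IS/FID evaluation. Purely for orientation, this is the general shape of such an evaluation using torchmetrics (an external dependency assumed here, with random tensors standing in for real CIFAR-10 images and generator samples):

```python
import torch
from torchmetrics.image.fid import FrechetInceptionDistance
from torchmetrics.image.inception import InceptionScore

fid = FrechetInceptionDistance(feature=2048)  # Inception-v3 pool features
inception = InceptionScore()

# torchmetrics expects uint8 NCHW images in [0, 255] by default.
real = torch.randint(0, 256, (64, 3, 32, 32), dtype=torch.uint8)  # stand-in for CIFAR-10
fake = torch.randint(0, 256, (64, 3, 32, 32), dtype=torch.uint8)  # stand-in for G(z)

fid.update(real, real=True)
fid.update(fake, real=False)
inception.update(fake)

print("FID:", fid.compute().item())
print("IS (mean, std):", inception.compute())
```

Note that a stable FID estimate needs thousands of samples in practice; 64 here just keeps the sketch fast.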

Main Pipeline

(Figure: the main TransGAN pipeline.)

Representative Visual Results

(Figure: representative CIFAR-10 visual results.)

README to be updated.

Acknowledgement

The codebase builds on AutoGAN and pytorch-image-models.

Citation

If you find this repo helpful, please cite:

@article{jiang2021transgan,
  title={{TransGAN}: Two pure transformers can make one strong {GAN}, and that can scale up},
  author={Jiang, Yifan and Chang, Shiyu and Wang, Zhangyang},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  year={2021}
}
