asarigun / MixerGANsformer

Official Implementation of MixerGANsformer in PyTorch.


MixerGANsformer: Can We Build a Strong GAN with Transformers and MLP-Mixers?

Official implementation of MixerGANsformer in PyTorch: a novel GAN that combines a Transformer-based generator with an MLP-Mixer discriminator. A preprint will be published soon.

Overview

In this model, the generator has the same structure as TransGAN's generator, while the discriminator is built from MLP-Mixer blocks. The goal is to create a strong, convolution-free GAN and to show that combining Transformers and MLP-Mixers may work better than building a GAN from pure Transformers or a pure MLP-Mixer alone.
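The discriminator side of this design can be sketched as follows. This is a minimal illustration of an MLP-Mixer classifier acting as a GAN discriminator, not the repository's actual code; all class names, widths, and depths here are assumptions. Patches are extracted with `nn.Unfold` plus a linear projection, so the model stays convolution-free:

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One MLP-Mixer block: a token-mixing MLP, then a channel-mixing MLP."""
    def __init__(self, num_tokens, dim, token_hidden=64, channel_hidden=192):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mlp = nn.Sequential(
            nn.Linear(num_tokens, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, num_tokens))
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, dim))

    def forward(self, x):  # x: (batch, tokens, dim)
        # Mix information across patches (tokens).
        x = x + self.token_mlp(self.norm1(x).transpose(1, 2)).transpose(1, 2)
        # Mix information across channels within each patch.
        return x + self.channel_mlp(self.norm2(x))

class MixerDiscriminator(nn.Module):
    """Patchify an image without convolutions, run Mixer blocks,
    and reduce to a single realness logit."""
    def __init__(self, image_size=32, patch=4, dim=96, depth=4):
        super().__init__()
        num_tokens = (image_size // patch) ** 2
        self.patchify = nn.Unfold(kernel_size=patch, stride=patch)
        self.embed = nn.Linear(3 * patch * patch, dim)  # per-patch projection
        self.blocks = nn.Sequential(
            *[MixerBlock(num_tokens, dim) for _ in range(depth)])
        self.head = nn.Linear(dim, 1)

    def forward(self, img):  # img: (batch, 3, 32, 32)
        x = self.embed(self.patchify(img).transpose(1, 2))
        x = self.blocks(x)
        return self.head(x.mean(dim=1))  # (batch, 1) realness logit
```

The generator would follow TransGAN's Transformer-based upsampling pipeline, which is not reproduced here.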

Usage

Before running train.py, make sure the libraries listed in requirements.txt are installed. To save your model during training, create a ./checkpoint folder with mkdir checkpoint.

Dataset

CIFAR10

Training

python train.py
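train.py drives an alternating-update GAN loop. A generic single step looks roughly like the following; the loss (standard non-saturating BCE here), the `train_step` function name, and the latent dimension are assumptions for illustration, since the repository's exact training recipe is not shown:

```python
import torch
import torch.nn as nn

def train_step(G, D, opt_g, opt_d, real, latent_dim=128):
    """One alternating GAN update: discriminator first, then generator."""
    bce = nn.BCEWithLogitsLoss()
    batch = real.size(0)

    # Discriminator update: push real samples toward 1, fakes toward 0.
    z = torch.randn(batch, latent_dim)
    fake = G(z).detach()  # detach so G gets no gradient from D's loss
    d_loss = (bce(D(real), torch.ones(batch, 1)) +
              bce(D(fake), torch.zeros(batch, 1)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator update: fool D into predicting 1 on fresh fakes.
    z = torch.randn(batch, latent_dim)
    g_loss = bce(D(G(z)), torch.ones(batch, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

Checkpoints would then be written to ./checkpoint at whatever interval train.py chooses.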

License

MIT
