Wentong-DST/attn-gan

Pytorch implementation of the paper: AttnGAN: Fine-Grained Text to Image Generation with Attentional Generative Adversarial Networks

AttnGAN

Pytorch implementation for reproducing AttnGAN results in the paper AttnGAN: Fine-Grained Text to Image Generation with Attentional Generative Adversarial Networks.

Dependencies

python 2.7

Pytorch

In addition, please add the project folder to PYTHONPATH and pip install the following packages:

  • python-dateutil, easydict, pandas, torchfile, nltk, scikit-image
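
For example, the environment can be prepared along these lines (the path to the project folder is a placeholder; adjust it to wherever you cloned the repository):

    # make the project importable and install the listed packages
    export PYTHONPATH=/path/to/attn-gan:$PYTHONPATH
    pip install python-dateutil easydict pandas torchfile nltk scikit-image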

Data

  • Download metadata (text and filename):

    • Download preprocessed metadata for coco and save them to data/
    • Extract caption files in "train2014-text.zip" and "val2014-text.zip" to data/coco/text/
    • [Optional] If you want to use the pre-trained models, please download the dictionary captions.pickle; otherwise it will be generated by pretrain_DAMSM.py.
  • Download images:

    • Download coco dataset and extract both train2014 and val2014 images to data/coco/images/
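
After these steps the data folder should look roughly like the sketch below (exact file names inside text/ and images/ depend on the downloaded archives):

    data/
      coco/
        captions.pickle      # optional, only needed for the pre-trained models
        text/                # captions extracted from train2014-text.zip and val2014-text.zip
        images/
          train2014/         # COCO train2014 images
          val2014/           # COCO val2014 images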

Training

  • Pre-train DAMSM models:

    • For coco dataset: python pretrain_DAMSM.py --cfg cfg/DAMSM/coco.yml --gpu 0
  • Train AttnGAN models:

    • For coco dataset: python main.py --cfg cfg/coco_attn2.yml --gpu 0
  • *.yml files are example configuration files for training/evaluating our models.

Pretrained Model

Sampling

  • Run python main.py --cfg cfg/eval_coco.yml --gpu 1 to generate examples from captions in files listed in "./data/coco/example_filenames.txt". Results are saved to DAMSMencoders/.
  • Change the eval_*.yml files to generate images from other pre-trained models.
  • Input your own sentences in "./data/coco/example_captions.txt" if you want to generate images from customized sentences.
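
For instance, "./data/coco/example_captions.txt" is a plain text file with one caption per line; illustrative contents (not shipped with the repository) could be:

    a man riding a wave on top of a surfboard
    a kitchen with a stove, a sink and a refrigerator
    two dogs playing with a frisbee in a grassy field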

Validation

  • To generate images for all captions in the validation dataset, change B_VALIDATION to True in the eval_*.yml file (see the excerpt after this list), and then run python main.py --cfg cfg/eval_coco.yml --gpu 1
  • We compute inception score for models trained on coco using improved-gan/inception_score.
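
A minimal sketch of the change in eval_coco.yml (only the flag named above is shown; leave the file's other keys untouched):

    # eval_coco.yml (excerpt)
    B_VALIDATION: True   # generate images for every caption in the validation set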

Examples generated by AttnGAN [Blog]

[bird example image] [coco example image]

Citing AttnGAN

If you find AttnGAN useful in your research, please consider citing:

@inproceedings{Tao18attngan,
  author    = {Tao Xu and Pengchuan Zhang and Qiuyuan Huang and Han Zhang and Zhe Gan and Xiaolei Huang and Xiaodong He},
  title     = {AttnGAN: Fine-Grained Text to Image Generation with Attentional Generative Adversarial Networks},
  booktitle = {{CVPR}},
  year      = {2018}
}

About


License: MIT License

