Why is there no batchnorm in the first layer of the critic?
menglin0320 opened this issue · comments
Any special reason?
It follows the same setting as in DCGAN and the PyTorch version of WGAN.
Yes. I just followed the model defined in the PyTorch version of WGAN; there is no special reason for it. But since even an MLP can do a decent job as a critic, you might want to try how it works with batchnorm in the first layer.
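For reference, a minimal sketch of the convention being discussed: a DCGAN-style critic where every conv block uses BatchNorm except the first. The class name, channel sizes, and input resolution here are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn

class Critic(nn.Module):
    """Hypothetical DCGAN-style critic for 32x32 RGB inputs."""

    def __init__(self, nc=3, ndf=64):
        super().__init__()
        self.main = nn.Sequential(
            # First block: conv + LeakyReLU only, no BatchNorm,
            # following the DCGAN convention the issue asks about.
            nn.Conv2d(nc, ndf, 4, 2, 1, bias=False),
            nn.LeakyReLU(0.2, inplace=True),
            # Later blocks do use BatchNorm.
            nn.Conv2d(ndf, ndf * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ndf * 2),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ndf * 2, ndf * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ndf * 4),
            nn.LeakyReLU(0.2, inplace=True),
            # Final layer maps to a single scalar score (no sigmoid in WGAN).
            nn.Conv2d(ndf * 4, 1, 4, 1, 0, bias=False),
        )

    def forward(self, x):
        return self.main(x).view(-1)

critic = Critic()
scores = critic(torch.randn(8, 3, 32, 32))
print(scores.shape)  # torch.Size([8])
```

Trying batchnorm in the first block would just mean inserting `nn.BatchNorm2d(ndf)` after the first conv and comparing training curves.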