SSGAN missing Self-Modulation Normalization?
shimopino opened this issue
Hi! Thank you for creating this easy-to-use library!
I was using your implementation of SSGAN to reproduce the results of the original paper: https://arxiv.org/abs/1810.01365.
As I read it, the best-performing model is the one that uses Self-Modulation (self-modulated Batch Normalization), rather than the plain Batch Normalization used in your GBlock https://github.com/kwotsin/mimicry/blob/master/torch_mimicry/modules/resblocks.py#L73.
Do you have any plans to add Self-Modulation Normalization?
Hi @KeisukeShimokawa, yes indeed, I mostly compared against the version without sBN to keep the design simple, since sBN is a general architectural improvement rather than something specific to SSGAN. That said, it would be an interesting addition, and since it's applicable to most unconditional GANs I think it would be a good fit for the library. I'll keep this issue open and post updates in the future!
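For anyone following along, the core idea of self-modulation (from the paper above) is to remove BatchNorm's own learned affine parameters and instead predict the per-channel scale and shift from the latent vector z. The sketch below is illustrative only, not the mimicry API: the class name, constructor arguments, and the use of single linear layers (the paper uses a small MLP) are all my own simplifications.

```python
import torch
import torch.nn as nn


class SelfModulatedBatchNorm2d(nn.Module):
    """Batch norm whose affine parameters are predicted from the latent z.

    Illustrative sketch of self-modulation (arXiv:1810.01365); names and
    signatures here are hypothetical, not part of torch_mimicry.
    """

    def __init__(self, num_features, nz):
        super().__init__()
        # BatchNorm without its own learned affine parameters.
        self.bn = nn.BatchNorm2d(num_features, affine=False)
        # Map z to a per-channel scale (gamma) and shift (beta).
        # The paper uses a small MLP; single linear layers keep this short.
        self.gamma = nn.Linear(nz, num_features)
        self.beta = nn.Linear(nz, num_features)
        # Initialise so the layer starts out as plain batch norm:
        # gamma(z) = 1 and beta(z) = 0 for all z.
        nn.init.zeros_(self.gamma.weight)
        nn.init.ones_(self.gamma.bias)
        nn.init.zeros_(self.beta.weight)
        nn.init.zeros_(self.beta.bias)

    def forward(self, x, z):
        h = self.bn(x)
        gamma = self.gamma(z).view(-1, h.size(1), 1, 1)
        beta = self.beta(z).view(-1, h.size(1), 1, 1)
        return gamma * h + beta
```

Wiring this into a generator block would mean threading z through each block's forward pass so every normalization layer can condition on it, which is the main structural change compared to the current GBlock.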
Excellent! Thanks!