Activation function of generator and discriminator?
YongWookHa opened this issue · comments
Ha YongWook commented
Hello.
In dcgan.py
in generator_model(), the model uses tanh
not just for the last layer but for every layer.
The original DCGAN paper recommends using ReLU for every generator layer except the last one, which uses tanh.
A similar discrepancy exists in discriminator_model as well.
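For reference, the paper's recommendation could be sketched like this. This is a toy dense-layer version in NumPy, not your actual dcgan.py code; the layer sizes and function names here are made up for illustration:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def generator_forward(z, weights):
    """Toy generator forward pass following the DCGAN paper's advice:
    ReLU on every hidden layer, tanh only on the final output layer
    (so outputs land in (-1, 1), matching images scaled to that range)."""
    h = z
    for W in weights[:-1]:
        h = relu(h @ W)              # hidden layers: ReLU
    return np.tanh(h @ weights[-1])  # output layer only: tanh

# Illustrative shapes: 100-dim noise -> 128 -> 64 -> 784 "pixels"
rng = np.random.default_rng(0)
weights = [
    rng.standard_normal((100, 128)) * 0.02,
    rng.standard_normal((128, 64)) * 0.02,
    rng.standard_normal((64, 784)) * 0.02,
]
fake = generator_forward(rng.standard_normal((1, 100)), weights)
print(fake.shape)  # (1, 784), all values in (-1, 1)
```

In your version, as I read it, tanh would appear where the ReLU calls are above.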
Is there a specific reason you built these differently?
Thank you for sharing your code, though.