yenchenlin / pix2pix-tensorflow

TensorFlow implementation of "Image-to-Image Translation Using Conditional Adversarial Networks".


Unbalanced training

danmic opened this issue · comments

commented

Hi,
while training the GAN on the facades dataset, I observed a really high g_loss (around 30 at epoch 27) and a quite low d_loss (around 5 at epoch 27). I am just running the code as-is, without changing anything. Is this behavior normal, or is there an issue I should tackle?

Here are the TensorBoard plots.

edit: Here are some generated samples from various epochs.

Hi @danmic!
I had the same experience. I guess it would be helpful to also run the original Torch code and see what happens there, since the training dynamics weren't covered in the paper.

commented

@rudolphyo
Was it only at the beginning of training, and did it then get better? I don't have a powerful GPU at the moment, so before running ~200 epochs I would like to know whether this behaviour is normal (and also why).

@danmic As far as I remember, the gap between g_loss and d_loss stayed pretty large throughout the whole training.
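For reference, a minimal sketch of how the pix2pix generator and discriminator objectives are typically composed, assuming the default L1 weight of 100 from the paper (the variable names `fake_logits`, `real_logits`, `generated`, `target`, and `L1_LAMBDA` below are illustrative, not the exact names used in this repo):

```python
import tensorflow as tf

# Sketch of the pix2pix loss composition (illustrative, not this repo's exact code).
L1_LAMBDA = 100.0  # default L1 weight used in the paper

bce = tf.nn.sigmoid_cross_entropy_with_logits

def generator_loss(fake_logits, generated, target):
    # Adversarial term: G tries to make D label its outputs as real (1).
    gan_term = tf.reduce_mean(bce(labels=tf.ones_like(fake_logits),
                                  logits=fake_logits))
    # Reconstruction term: mean absolute (L1) error against the ground truth.
    l1_term = tf.reduce_mean(tf.abs(target - generated))
    # The L1 term is multiplied by 100, so it dominates the reported g_loss.
    return gan_term + L1_LAMBDA * l1_term

def discriminator_loss(real_logits, fake_logits):
    # D is trained to label real pairs as 1 and generated pairs as 0.
    real_term = tf.reduce_mean(bce(labels=tf.ones_like(real_logits),
                                   logits=real_logits))
    fake_term = tf.reduce_mean(bce(labels=tf.zeros_like(fake_logits),
                                   logits=fake_logits))
    return real_term + fake_term
```

Under this composition, an average per-pixel L1 error of roughly 0.3 already gives a g_loss near 30, while d_loss is a plain cross-entropy on its own scale, so the two numbers are not directly comparable and a large gap does not by itself indicate unbalanced training.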