google / compare_gan

Compare GAN code.


WGAN gradient penalty coupled with spectral normalization: generator and discriminator losses are both NaN

IPNUISTlegal opened this issue · comments

commented

With the gradient penalty coupled with spectral normalization, both losses (generator and discriminator) are NaN.
If I remove the gradient penalty and just use spectral normalization in the WGAN objective E_{x∼q_data}[D(x)] − E_{z∼p(z)}[D(G(z))], both losses are normal.
Why? I am a newbie in deep learning.
Thanks!

It should be fine to use both GP and SN for the discriminator, and it may work better than SN alone.
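For intuition about what the gradient penalty computes: WGAN-GP penalizes (‖∇_x̂ D(x̂)‖₂ − 1)² at points interpolated between real and fake samples. The sketch below is a toy NumPy illustration with a hypothetical linear critic D(x) = w·x, whose input gradient is just w, so no autodiff is needed; it is not the compare_gan implementation, where the gradient comes from TensorFlow autodiff on a full conv net.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_penalty_linear(w, x_real, x_fake, lam=10.0):
    """WGAN-GP penalty for a toy linear critic D(x) = w @ x.

    For a linear critic, the input gradient of D is w itself at every
    point, so the penalty reduces to lam * (||w||_2 - 1)^2 per sample.
    """
    eps = rng.uniform(size=(x_real.shape[0], 1))
    x_hat = eps * x_real + (1.0 - eps) * x_fake  # interpolated points
    grad = np.tile(w, (x_hat.shape[0], 1))       # dD/dx = w everywhere
    norms = np.linalg.norm(grad, axis=1)
    return lam * np.mean((norms - 1.0) ** 2)

w = np.array([3.0, 4.0])                 # ||w||_2 = 5
x_real = rng.normal(size=(8, 2))
x_fake = rng.normal(size=(8, 2))
gp = gradient_penalty_linear(w, x_real, x_fake)
print(gp)  # 10 * (5 - 1)^2 = 160.0
```

If this term ever produces NaN in a real model, a common culprit is a zero-norm gradient (the norm's derivative blows up at 0) or an inconsistent SN implementation, consistent with the fix reported later in this thread.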

commented

I was using a wrong SN implementation. After switching to Google's SN code, it works fine.

commented

@chentingpc Sorry to bother you again. The spectral_norm function in the code performs spectral normalization on a weight tensor, and the authors apply it to convolutions. I wonder whether spectral_norm can also be applied to deconvolution (transposed convolution). The shape of w in a convolution is

"w", [k_h, k_w, input_.get_shape()[-1], output_dim],

while the shape of w in a deconvolution is

"w", [k_h, k_w, output_shape[-1], input_.get_shape()[-1]],

Is it fine to call spectral_norm(w) on a deconvolution weight directly, the same way as on a convolution weight?
Thanks a lot.

It is fine: power iteration on A and on Aᵀ converges to the same largest singular value of A. You may save a small amount of compute if the power iteration vector is kept on the smaller dimension.
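This can be checked numerically: the alternating u/v power iteration used by spectral normalization (Miyato et al., 2018) estimates the same largest singular value whether it is run on a matrix or its transpose. A NumPy sketch (an assumption-level illustration, not the compare_gan code; the 144×32 shape stands in for a flattened 3×3×16×32 kernel):

```python
import numpy as np

rng = np.random.default_rng(42)

def sigma_max_power_iter(w, n_iters=500):
    """Estimate the largest singular value of a 2-D matrix w with the
    alternating u/v power iteration used by spectral normalization."""
    u = rng.normal(size=w.shape[0])
    for _ in range(n_iters):
        v = w.T @ u
        v /= np.linalg.norm(v)
        u = w @ v
        u /= np.linalg.norm(u)
    return float(u @ w @ v)

# Conv-style flattening (rows = k_h*k_w*in_ch) versus the transposed
# orientation: both converge to the same sigma_max.
w = rng.normal(size=(144, 32))
sigma_w = sigma_max_power_iter(w)
sigma_wt = sigma_max_power_iter(w.T)   # u lives in the smaller dim here
exact = float(np.linalg.svd(w, compute_uv=False)[0])
print(sigma_w, sigma_wt, exact)
```

Running the iteration on w.T keeps the u vector in the 32-dimensional space rather than the 144-dimensional one, which is the small compute saving mentioned above.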