NVlabs / FUNIT

Translate images to unseen domains at test time with a few example images.

Home Page: https://nvlabs.github.io/FUNIT/


G acc: 0.0000

hujinsen opened this issue

Elapsed time in update: 1.213263
Iteration: 00065911/00100000
D acc: 0.9995 G acc: 0.0000
Elapsed time in update: 1.134419
Iteration: 00065912/00100000
D acc: 0.9999 G acc: 0.0000
Elapsed time in update: 1.168156
Iteration: 00065913/00100000
D acc: 1.0000 G acc: 0.0000
Elapsed time in update: 1.190268
Iteration: 00065914/00100000
D acc: 0.9999 G acc: 0.0000
Elapsed time in update: 1.129889
Iteration: 00065915/00100000
D acc: 1.0000 G acc: 0.0000
Elapsed time in update: 1.128156
Iteration: 00065916/00100000
D acc: 0.9999 G acc: 0.0000
Elapsed time in update: 1.135906
Iteration: 00065917/00100000
D acc: 1.0000 G acc: 0.0000
Elapsed time in update: 1.108964
Iteration: 00065918/00100000

G acc is 0, is this normal?

Same problem. My D performs too well, which leaves G with almost no training signal.
I recommend you:

  1. Decrease D's learning rate.

  2. Run K iterations on G for every 1 iteration on D (see the sketch after this list).
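A minimal sketch of that schedule, assuming generic PyTorch modules G and D, a data loader, and hypothetical loss helpers g_loss_fn / g_loss_fn are placeholders I made up; this is not FUNIT's actual trainer, and you can flip the ratio if you prefer the standard K-steps-on-D schedule:

```python
import torch

# Hypothetical setup: G and D are nn.Modules, loader yields image batches.
opt_d = torch.optim.Adam(D.parameters(), lr=1e-5)  # 1. lower lr for D
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)

K = 3  # 2. K generator updates per discriminator update
for x_real in loader:
    for _ in range(K):
        opt_g.zero_grad()
        g_loss_fn(G, D, x_real).backward()  # assumed adversarial G loss
        opt_g.step()
    opt_d.zero_grad()
    d_loss_fn(G, D, x_real).backward()      # assumed adversarial D loss
    opt_d.step()
```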

After doing these, my results got better but are still not satisfactory.
The original code uses the same learning rate for G and D and a 1:1 update ratio, so I wonder why it can succeed.

@MarStarck Hi, in GAN theory, D measures the gap between the real and fake distributions, so it does not make sense to run K iterations on G for every 1 on D. On the contrary, you should run K iterations on D for every 1 on G.

@Johnson-yue Yes, you are right. But in my experiment this schedule really does improve results when G acc is 0.
It's strange, because in my understanding the calc_grad2 term should in theory prevent the vanishing-gradient problem.
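As I understand it, calc_grad2 computes a gradient penalty on the real images (the R1 term of Mescheder et al.). A minimal sketch of that term with hypothetical names, assuming a discriminator D:

```python
import torch

def r1_penalty(d_out, x_real):
    # Gradient of D's output w.r.t. the real images; create_graph=True so
    # the penalty itself can be backpropagated through.
    grad, = torch.autograd.grad(
        outputs=d_out.sum(), inputs=x_real, create_graph=True)
    # Squared L2 norm per sample, averaged over the batch.
    return grad.pow(2).reshape(grad.size(0), -1).sum(1).mean()

# Hypothetical usage inside the D update:
# x_real.requires_grad_(True)
# d_out = D(x_real)
# d_loss = adv_loss(d_out) + 10.0 * r1_penalty(d_out, x_real)
```

The penalty keeps D's gradients bounded near the real data, which is why one would expect useful gradient to keep flowing to G.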

Yes, but that only holds in theory, or it may depend on your dataset. I want to know: does it actually make the generated images more realistic?