shaoanlu / faceswap-GAN

A denoising autoencoder + adversarial losses and attention mechanisms for face swapping.


Why swap decoder during training?

zhanglonghao1992 opened this issue · comments

commented

Hi @shaoanlu, thanks for your amazing code.
I am a little confused about the training step of swapping decoders.
At about 4/5 of the training process, you do:
```python
model.decoder_A.load_weights("models/decoder_B.h5")
model.decoder_B.load_weights("models/decoder_A.h5")
```

Since decoder_A is used to transform a warped face into a fake face A, why swap its weights with decoder_B's?
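For what it's worth, mechanically the step just writes each decoder's weights to disk and then loads them into the *other* decoder, so the two decoders exchange parameters while the shared encoder is untouched. Below is a minimal numpy stand-in for the Keras `save_weights`/`load_weights` calls (the toy `Decoder` class and file names are hypothetical, just to illustrate the cross-load):

```python
import numpy as np

class Decoder:
    """Toy stand-in for a Keras decoder: it just holds one weight array."""
    def __init__(self, w):
        self.w = np.asarray(w, dtype=np.float64)

    def save_weights(self, path):
        np.save(path, self.w)   # analogue of Keras Model.save_weights

    def load_weights(self, path):
        self.w = np.load(path)  # analogue of Keras Model.load_weights

decoder_A = Decoder([1.0, 2.0])  # pretend these were trained on identity A
decoder_B = Decoder([3.0, 4.0])  # pretend these were trained on identity B

# Save both, then cross-load: A receives B's weights and vice versa.
decoder_A.save_weights("decoder_A.npy")
decoder_B.save_weights("decoder_B.npy")
decoder_A.load_weights("decoder_B.npy")
decoder_B.load_weights("decoder_A.npy")

print(decoder_A.w)  # [3. 4.]
print(decoder_B.w)  # [1. 2.]
```

Note the order matters: both decoders must be saved before either one loads, otherwise the first load would overwrite weights that have not been written out yet.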

@zhanglonghao1992 Hi, have you figured out why the decoders are swapped? I ran a few experiments and did not find it useful. For example, the loss for GA is higher after swapping decoders and does not recover.