Possible mistake in vanilla_vae 'loss_function'
sh3rlock14 opened this issue · comments
Mattia Capparella commented
I think there's a mistake in the `KLD` value returned by `loss_function` in vanilla_vae.py:
return {'loss': loss, 'Reconstruction_Loss':recons_loss.detach(), 'KLD':-kld_loss.detach()}
The negative sign (`-`) should not be there!
David Valencia commented
Agreed; the negative sign is already applied when `kld_loss` is computed, so it should not be negated again in the returned dict.
Victor Liu commented
Agreed. Line 143 already includes the negative sign.
Maximusprime3 commented
This also happens in mssim_vae.py, line 155:

```python
kld_loss = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim = 1), dim = 0)
loss = recons_loss + kld_weight * kld_loss
return {'loss': loss, 'Reconstruction_Loss': recons_loss, 'KLD': -kld_loss}
```
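To see why the extra sign is wrong, here is a minimal pure-Python sketch (no torch, and `kld_diag_gaussian` is a hypothetical helper name, not from the repo) of the same closed-form KL term for a diagonal Gaussian posterior against a standard normal prior. The KL divergence is always non-negative, so `kld_loss` is already the correctly signed quantity, and returning `-kld_loss` logs a negative KL:

```python
import math

def kld_diag_gaussian(mu, log_var):
    # Closed-form KL( N(mu, exp(log_var)) || N(0, I) ) for a diagonal Gaussian,
    # mirroring the expression in loss_function:
    #   kld = -0.5 * sum(1 + log_var - mu^2 - exp(log_var))
    return -0.5 * sum(1 + lv - m ** 2 - math.exp(lv)
                      for m, lv in zip(mu, log_var))

# When the posterior equals the prior, the KL is zero.
print(kld_diag_gaussian([0.0, 0.0], [0.0, 0.0]))
# Shifting one mean away from the prior gives a positive KL.
print(kld_diag_gaussian([1.0, 0.0], [0.0, 0.0]))  # 0.5
# So logging {'KLD': -kld_loss} would report a negative KL value.
```

Since `kld_loss` is non-negative by construction, the fix proposed in this thread is simply to log `kld_loss.detach()` without the leading minus.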