AntixK / PyTorch-VAE

A Collection of Variational Autoencoders (VAE) in PyTorch.

Possible mistake in vanilla_vae 'loss_function'

sh3rlock14 opened this issue · comments

I think there is a mistake in the `KLD` value returned by `loss_function` in vanilla_vae.py:

```python
return {'loss': loss, 'Reconstruction_Loss': recons_loss.detach(), 'KLD': -kld_loss.detach()}
```

The negative sign (`-`) should not be there!

Agree. The negative sign is already applied when `kld_loss` is computed, so it should not be applied again in the returned value.

Agree. Line 143 already has a negative sign.

This also happens in mssim_vae.py, line 155:

```python
kld_loss = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1), dim=0)

loss = recons_loss + kld_weight * kld_loss
return {'loss': loss, 'Reconstruction_Loss': recons_loss, 'KLD': -kld_loss}
```
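To see why the extra minus sign is wrong: `kld_loss` already implements the closed-form KL divergence between the diagonal-Gaussian posterior and the standard-normal prior, `KL = -0.5 * sum(1 + log_var - mu^2 - exp(log_var))`, which is nonnegative by construction. Negating it again in the returned dict makes the logged `KLD` metric negative. Below is a minimal pure-Python sketch of the same formula (not the repo's code; the function name `kl_diag_gaussian` is made up for illustration), showing that the value before the extra negation is already the correctly-signed KL:

```python
import math

def kl_diag_gaussian(mu, log_var):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over dimensions.
    # This mirrors the torch expression in the issue:
    #   -0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1)
    return -0.5 * sum(1 + lv - m ** 2 - math.exp(lv) for m, lv in zip(mu, log_var))

# KL is 0 when the posterior equals the prior, and positive otherwise,
# so the already-negated sum is the quantity that should be reported.
print(kl_diag_gaussian([0.0, 0.0], [0.0, 0.0]))  # 0 (posterior == prior)
print(kl_diag_gaussian([1.0, 0.0], [0.0, 0.0]))  # 0.5 (mean shifted by 1)
```

Since `kl_diag_gaussian` is nonnegative, returning `'KLD': -kld_loss` would log values `<= 0`; dropping the minus sign reports the true (nonnegative) divergence.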