Training with MSE, then fine-tuning with MS-SSIM, but aux loss grows larger
gitlabspy opened this issue · comments
Following https://interdigitalinc.github.io/CompressAI/zoo.html#training, I first train with MSE with
`aux_loss` being large on its own isn't a problem. Typical values I have observed are `aux_loss ~= 2 * batch_size`, but this varies depending on factors such as the number of channels and the batch-to-batch stability of the `EntropyBottleneck` distributions. But if you want to reduce it to 0, you can try #231.
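For intuition on why the magnitude scales with the number of channels: the auxiliary loss is a sum of per-channel terms that pull each `EntropyBottleneck` channel's quantile estimates toward fixed tail-mass targets, independent of rate-distortion quality. A minimal pure-Python sketch of that structure (the channel counts, target values, and predictions below are hypothetical illustrations, not the library's actual parameters):

```python
# Hypothetical sketch: aux loss as a sum of per-channel absolute errors
# between predicted cumulative logits and fixed quantile targets.
def aux_loss(pred_logits, target_logits):
    """Sum |prediction - target| over all channels and quantile points."""
    return sum(
        abs(p - t)
        for channel_p, channel_t in zip(pred_logits, target_logits)
        for p, t in zip(channel_p, channel_t)
    )

# Two models whose channels fit their tails equally well per channel...
targets_small = [[-4.0, 0.0, 4.0]] * 64    # 64 latent channels
targets_large = [[-4.0, 0.0, 4.0]] * 192   # 192 latent channels
preds_small = [[-3.9, 0.1, 4.1]] * 64
preds_large = [[-3.9, 0.1, 4.1]] * 192

# ...still report different aux losses purely because of channel count.
small = aux_loss(preds_small, targets_small)
large = aux_loss(preds_large, targets_large)
```

In CompressAI itself this role is played by the model's `aux_loss()` method, which the example training scripts optimize with a separate `aux_optimizer` from the main rate-distortion loss.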