Loss decreases for a few batches, then suddenly explodes to inf
angusxu458 opened this issue · comments
Hi Nils, I ran into a problem while training DTLN-aec. When I train only on real data, the loss stabilizes at about 0.03.
However, when I train on both real and synthetic data, the loss decreases for some batches and then suddenly explodes to inf.
I suspect it is because of the log in the SNR loss. Is this behavior normal? Or will the loss settle at a finite value after several epochs?
@anguschowchowxu, I have been encountering the same issue. Have you found a solution?
@MuhammadurRehman19058 In the synthetic near-end speech data, some noise-free samples are exact all-zero arrays, which produces inf when computing the SNR.
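To illustrate why an all-zero near-end target blows up a log-based SNR loss, here is a minimal NumPy sketch (the function name `snr_loss` and the epsilon workaround are my own illustration, not the exact code from the DTLN-aec repository): the signal power in the numerator becomes 0, `log10(0)` is `-inf`, and negating it yields a loss of `+inf`.

```python
import numpy as np

def snr_loss(y_true, y_pred, eps=0.0):
    """Negative SNR in dB. With eps=0 and an all-zero target,
    the ratio is 0 and log10(0) = -inf, so the loss is +inf."""
    signal_power = np.sum(np.square(y_true)) + eps
    noise_power = np.sum(np.square(y_true - y_pred)) + eps
    return -10.0 * np.log10(signal_power / noise_power)

clean = np.zeros(16000)                 # silent (all-zero) near-end segment
estimate = 0.01 * np.ones(16000)        # non-zero model output

bad = snr_loss(clean, estimate)         # +inf: this is the explosion
good = snr_loss(clean, estimate, eps=1e-7)  # finite with a small epsilon
```

One common fix is therefore either to add a small epsilon inside the log as above, or to filter out (or re-mix) training samples whose near-end target is entirely zero before computing the loss.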