loss is nan
weimisa341 opened this issue · comments
When I use another dataset to train your model, the loss becomes NaN. My dataloader works fine for training other models, and if I use MSE loss your model trains normally. Could you please share your thoughts on this problem? Could you also tell me the normalization range you used? I think it may be a factor.
Hey, as you realise I cannot possibly find the bug without looking at your code and your data. But my guess would be that there is an all-zeros vector somewhere that makes the log blow up (log(0) is -inf, which then propagates as NaN).
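One quick way to check for this failure mode is to clamp values before taking the log. A minimal sketch (the loss form and epsilon value are illustrative, not from the repository's actual code):

```python
import math

def safe_log_loss(prob, eps=1e-12):
    """Negative log-likelihood with clamping to avoid log(0) = -inf."""
    # An all-zeros prediction would make a plain -log(prob) diverge;
    # clamping to a small epsilon keeps the loss finite.
    return -math.log(max(prob, eps))

print(safe_log_loss(0.0))  # large but finite instead of inf/NaN
print(safe_log_loss(0.9))
```

In PyTorch the equivalent would be clamping the tensor (e.g. with a small minimum value) before calling the log, or checking the data pipeline for batches that normalize to all zeros.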
Thanks for your help.