MeteoSwiss / ldcast

Latent diffusion for generative precipitation nowcasting

question: validation loss of the trained model

tomasvanoyen opened this issue · comments

Hi @jleinonen,

In order to reproduce the results, it would be nice to know the val_loss_ema that corresponds to the weights included in the Zenodo repository.

In particular, I tried making predictions with a model that had a val_loss_ema of about 0.00753, and the predictions don't show any realistic features.

Thanks and regards,

Tomas

Hi @jleinonen, and others interested,

using the included model weights, I obtain a val_loss_ema of 0.00349 with a sample shape of (4,4), and a val_loss_ema of 0.0022 with a sample shape of (8,8).

After extensive training (without finetuning on a dataset with shape (8,8)), I managed to obtain a val_loss_ema of 0.0036 for the (4,4) case and a val_loss_ema of 0.0023 for the (8,8) case.
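For readers unfamiliar with the metric being compared here: val_loss_ema is typically the validation loss evaluated with an exponential moving average (EMA) of the model weights rather than the raw training weights. The sketch below is a generic PyTorch illustration of that idea, assuming a simple supervised loss; it is not the ldcast implementation, and the class and function names are hypothetical.

```python
import copy

import torch
from torch import nn


class EMA:
    """Exponential moving average of a model's parameters (generic sketch,
    not ldcast's implementation)."""

    def __init__(self, model: nn.Module, decay: float = 0.999):
        self.decay = decay
        # Shadow copy holds the averaged weights; it is never trained directly.
        self.shadow = copy.deepcopy(model).eval()
        for p in self.shadow.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model: nn.Module):
        # shadow <- decay * shadow + (1 - decay) * current weights
        for s, p in zip(self.shadow.parameters(), model.parameters()):
            s.mul_(self.decay).add_(p, alpha=1.0 - self.decay)


@torch.no_grad()
def val_loss_ema(ema: EMA, loader, loss_fn) -> float:
    """Validation loss computed with the EMA weights (sample-weighted mean)."""
    total, n = 0.0, 0
    for x, y in loader:
        total += loss_fn(ema.shadow(x), y).item() * len(x)
        n += len(x)
    return total / n
```

The point of the EMA copy is that diffusion models are usually sampled (and validated) with the averaged weights, which is why the checkpointed val_loss_ema is the number worth comparing against.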

I believe this comes close ...

Regards,

Tomas

Hi Tomas, thanks for looking into this. Are you getting good results with the model you trained yourself?

By the way - unfortunately I found that sometimes a good val_loss_ema doesn't necessarily indicate great generated samples...

Hi @jleinonen ,

with a model that wasn't further finetuned on a larger domain (val_loss_ema of 0.0036), I didn't obtain reasonable features.

After finetuning the model on a larger domain (using a sample shape of (6,6), since a larger domain didn't fit in the memory of the A100 I am using), I reached a val_loss_ema of 0.0023, and the features appear to be reasonable.
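The finetuning step described above amounts to loading the pretrained checkpoint and continuing training at a lower learning rate on larger spatial samples. A minimal generic PyTorch sketch of that workflow is below; the helper names are hypothetical, ldcast's own training scripts and checkpoint layout may differ, and the `state_dict` key is an assumption about how the checkpoint was saved.

```python
import torch
from torch import nn


def load_for_finetuning(model: nn.Module, ckpt_path: str, lr: float = 1e-5):
    """Load pretrained weights and build a low-learning-rate optimizer.

    Hypothetical helper; assumes the checkpoint is either a raw state dict
    or a dict with a "state_dict" key (as PyTorch Lightning saves them).
    """
    state = torch.load(ckpt_path, map_location="cpu")
    model.load_state_dict(state.get("state_dict", state))
    return torch.optim.AdamW(model.parameters(), lr=lr)


def finetune_step(model: nn.Module, optimizer, loss_fn, batch) -> float:
    """One gradient step on a batch; larger-domain batches just mean the
    spatial dimensions of x are bigger (e.g. (6,6) patches instead of (4,4))."""
    x, y = batch
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Memory scales with the spatial extent of the samples, which is why (6,6) was the largest shape that fit on a single A100 here.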

A video with such predictions can temporarily be found here: https://we.tl/t-BLQnYPvhUQ (drop me a message at tomas.vanoyen@prophesea.eu if the link has expired and you would like to see the video).

It therefore appears to me that training on the larger domain is crucial for obtaining realistic predictions.

Best regards,

Tomas