MeteoSwiss / ldcast

Latent diffusion for generative precipitation nowcasting

The parameter sample_shape does not work when fine-tuning the generative model using 256x256 pixel samples

bugsuse opened this issue · comments

Hi, I was running train_genforecast.py to fine-tune the model using 256x256 pixel samples. The model was initialized with the weights obtained from pre-training with 128x128 pixel samples.

I found that the sample_shape parameter of the train function in train_genforecast.py is not used when fine-tuning the model.

That is, sample_shape remains at its default of (4, 4) when the datamodule is built for fine-tuning:

datamodule = setup_data(
    future_timesteps=future_timesteps, use_obs=use_obs, use_nwp=use_nwp,
    sampler_file=sampler_file, batch_size=batch_size
)

Would you mind checking this for me?

Thanks for the report again! This is fixed in the above commit.

I also double-checked that this was not affecting the training runs for the paper (for instance, the increase in time per training sample during the fine-tuning phase is consistent with the increase in sample size). Rather, this parameter was omitted when the scripts were prepared for the code release.