REMNet: Recurrent Evolution Memory-Aware Network for Accurate Long-Term Weather Radar Echo Extrapolation

The official PyTorch implementation of REMNet (IEEE TGRS, 2022).

Authors: Jinrui Jing, Qian Li, Leiming Ma, Lei Chen, Lei Ding

Paper Abstract

Weather radar echo extrapolation, which predicts future echoes based on historical observations, is one of the complicated spatial–temporal sequence prediction tasks and plays a prominent role in severe convection and precipitation nowcasting. However, existing extrapolation methods mainly focus on a defective echo-motion extrapolation paradigm based on finite observational dynamics, neglecting that the actual echo sequence has a more complicated evolution process that contains both nonlinear motions and the lifecycle from initiation to decay, resulting in poor prediction precision and limited application ability. To complement this paradigm, we propose to incorporate a novel long-term evolution regularity memory (LERM) module into the network, which can memorize long-term echo-evolution regularities during training and be recalled for guiding extrapolation. Moreover, to resolve the blurry prediction problem and improve forecast accuracy, we also adopt a coarse–fine hierarchical extrapolation strategy and compositive loss function. We separate the extrapolation task into two levels, coarse and fine, which reduces the downsampling loss and retains echo fine details. In addition to the average reconstruction loss, we employ adversarial loss and perceptual similarity loss to further improve the visual quality. Experimental results from two real radar echo datasets demonstrate the effectiveness of our methodology and show that it can accurately extrapolate the echo evolution while ensuring the echo details are realistic enough, even for the long term. Our method can further be improved in the future by integrating multimodal radar variables or introducing certain domain prior knowledge of physical mechanisms. It can also be applied to other spatial–temporal sequence prediction tasks, such as the prediction of satellite cloud images and wind field figures.

Setup

  1. PyTorch >= 1.6.0
  2. Anaconda, CUDA, and cuDNN are recommended
  3. Other required Python libraries: PIL, torchvision, tensorboard, skimage, tqdm, xlwt, matplotlib
  4. Prepare the two radar echo datasets: HKO-7 and Shanghai-2020
  5. Set the correct dataset roots in configs.py
  6. Run the HKO_7_preprocessing.py and Shanghai_2020_preprocessing.py scripts in the ./datasets folder to preprocess the two datasets before training (see the example commands below)
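For example, assuming the scripts are run from the repository root:

$ cd datasets
$ python HKO_7_preprocessing.py
$ python Shanghai_2020_preprocessing.py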

Training

Run the train_hko_7.py or train_shanghai_2020.py script to train the model on the corresponding dataset:

$ python train_hko_7.py
$ python train_shanghai_2020.py

You can also change the default training settings in configs.py, such as device_ids or train_batch_size. Training can be distributed across multiple GPUs. In our work, the training batch size is set to 8, distributed across 4 RTX 2080 Ti GPUs (11 GB of memory each).
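For reference, the relevant entries in configs.py might look like the sketch below (only device_ids and train_batch_size are named in this README; the values and surrounding layout are assumptions):

# configs.py (illustrative excerpt, not the actual file)
device_ids = [0, 1, 2, 3]  # GPU indices for distributed training (assumed values)
train_batch_size = 8       # total batch size, split across the GPUs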

The training log files are stored in the logdir folder; you can visualize them with TensorBoard.
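For example:

$ tensorboard --logdir ./logdir

Then open the URL that TensorBoard prints (http://localhost:6006 by default) in a browser.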

The trained model is saved to the checkpoints folder automatically and periodically (controlled by model_save_fre).
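A minimal sketch of such periodic checkpointing (cfg.model_save_fre and the filename pattern are assumptions; the actual logic in the training scripts may differ):

# Hypothetical sketch of periodic model saving, not the repository's code
if epoch % cfg.model_save_fre == 0:
    torch.save(model.state_dict(), f"checkpoints/model_epoch_{epoch}.pth")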

Test

To test the pretrained model on the test set and the provided samples, run:

$ python test_hko_7.py
$ python test_shanghai_2020.py

Our pretrained models can be downloaded from the HKO-7 and Shanghai-2020 links; put them into the checkpoints folder before testing.
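After downloading, the checkpoints folder should look roughly like this (the exact checkpoint filenames are assumptions, not taken from this README):

checkpoints/
  hko_7.pth            # pretrained weights for HKO-7 (filename assumed)
  shanghai_2020.pth    # pretrained weights for Shanghai-2020 (filename assumed)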

We have provided several test samples in the test_samples folder, and you can add your own following the same pattern. The test results will be saved in the test_results folder.

If you only want to test the model on the test samples, without evaluating quantitatively on the test set, set test_samples_only=True.
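A minimal sketch, assuming test_samples_only is a flag in configs.py (the README does not say where this flag is defined):

# configs.py: skip the quantitative test-set evaluation and only run the
# qualitative test samples (location of this flag is an assumption)
test_samples_only = True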

We have also provided some test-sample GIFs in the figures folder.

Citation

If you use any part of this project or the paper in your work, please cite the following paper:

@Article{Jing_2022_TGRS,
  author  = {Jing, Jinrui and Li, Qian and Ma, Leiming and Chen, Lei and Ding, Lei},
  title   = {REMNet: Recurrent Evolution Memory-Aware Network for Accurate Long-Term Weather Radar Echo Extrapolation},
  journal = {IEEE Transactions on Geoscience and Remote Sensing},
  year    = {2022},
}
