DDistill-SR: Reparameterized Dynamic Distillation Network for Lightweight Image Super-Resolution (TMM 2023)

Paper: https://doi.org/10.1109/TMM.2022.3219646


Yan Wang¹, Tongtong Su¹, Yusen Li¹†, Jiuwen Cao², Gang Wang¹, Xiaoguang Liu¹

¹Nankai University, ²Hangzhou Dianzi University

Abstract: Recent research on deep convolutional neural networks (CNNs) has delivered significant performance gains on efficient super-resolution (SR) tasks by trading off performance and applicability. However, most existing methods focus on cutting feature-processing cost to reduce parameters and computation without refining the intermediate features, which leaves the restoration with inadequate information. In this paper, we propose a lightweight network termed DDistill-SR, which significantly improves SR quality by capturing and reusing more helpful information in a static-dynamic feature distillation manner. Specifically, we propose a plug-in reparameterized dynamic unit (RDU) to improve the trade-off between performance and inference cost. During the training phase, the RDU learns to linearly combine multiple reparameterizable blocks by analyzing varied input statistics to enhance layer-level representation. In the inference phase, the RDU is equivalently converted into simple dynamic convolutions that explicitly capture robust dynamic and static feature maps. Then, an information distillation block is constructed from several RDUs to enforce hierarchical refinement and selective fusion of spatial context information. Furthermore, we propose a dynamic distillation fusion (DDF) module that enables dynamic signal aggregation and communication between hierarchical modules to further improve performance. Empirical results show that DDistill-SR outperforms the baselines and achieves state-of-the-art results on most super-resolution benchmarks with far fewer parameters and less computational overhead.
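
To make the RDU idea above concrete, here is a minimal, hedged PyTorch sketch of the train-time/inference-time conversion: several reparameterizable branches are mixed by input-dependent weights during training, and because every branch is linear and bias-free, each one can be folded into an equivalent 3x3 kernel, so inference reduces to a single dynamic convolution. This illustrates the mechanism only and is not the authors' implementation; the names `RDUSketch`, `kernel_bank`, and `forward_inference` are invented here, and the branch set is illustrative.

```python
# Hedged sketch of an RDU-style unit: multi-branch training form that folds
# into a single dynamic convolution at inference. NOT the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RDUSketch(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.conv1 = nn.Conv2d(channels, channels, 1, bias=False)
        # Gating head: global statistics -> one weight per branch (dynamic part).
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, 3, 1),
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        # Training-time form: run the branches and mix their outputs.
        w = self.gate(x)  # (N, 3, 1, 1)
        return (self.conv3(x) * w[:, 0:1]
                + self.conv1(x) * w[:, 1:2]
                + x * w[:, 2:3])  # identity branch

    @torch.no_grad()
    def kernel_bank(self):
        # Every linear branch has an exact 3x3 form: the 1x1 kernel sits at
        # the center tap, and the identity is a delta kernel.
        c = self.conv3.out_channels
        k3 = self.conv3.weight
        k1 = F.pad(self.conv1.weight, [1, 1, 1, 1])
        kid = torch.zeros_like(k3)
        kid[range(c), range(c), 1, 1] = 1.0
        return torch.stack([k3, k1, kid])  # (3, C, C, 3, 3)

    def forward_inference(self, x):
        # Inference-time form: one dynamic conv per sample, using the
        # gate-weighted sum of the merged kernels (same math as forward()).
        bank = self.kernel_bank()  # could be cached after training
        w = self.gate(x)
        outs = []
        for i in range(x.size(0)):
            k = (w[i].view(3, 1, 1, 1, 1) * bank).sum(0)  # (C, C, 3, 3)
            outs.append(F.conv2d(x[i:i+1], k, padding=1))
        return torch.cat(outs)
```

The two forms are equivalent because the branches carry no bias, so weighting the branch outputs equals convolving once with the gate-weighted sum of the merged kernels.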

This repository contains a PyTorch implementation of DDistill-SR (TMM 2023).
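
The abstract also describes building an information distillation block from several RDUs with hierarchical refinement and selective fusion. Below is a rough sketch of one plausible arrangement in the style of residual feature distillation networks, reusing `RDUSketch` from the snippet above; the step count, distilled width, and fusion layout are assumptions, not the paper's exact topology.

```python
# Hedged sketch of an RDU-based distillation block; assumes RDUSketch from
# the previous snippet is in scope. Hyperparameters are illustrative only.
import torch
import torch.nn as nn

class DistillBlockSketch(nn.Module):
    """Each step keeps a slim 'distilled' copy of the current feature via a
    1x1 conv and refines the full-width feature with an RDU; the kept copies
    and the final feature are fused by a 1x1 conv, with a residual skip."""
    def __init__(self, channels: int, steps: int = 3, distilled: int = 16):
        super().__init__()
        self.refine = nn.ModuleList(RDUSketch(channels) for _ in range(steps))
        self.keep = nn.ModuleList(
            nn.Conv2d(channels, distilled, 1) for _ in range(steps))
        self.fuse = nn.Conv2d(steps * distilled + channels, channels, 1)

    def forward(self, x):
        kept, h = [], x
        for rdu, keep in zip(self.refine, self.keep):
            kept.append(keep(h))   # selective retention (distillation)
            h = rdu(h)             # hierarchical refinement
        return self.fuse(torch.cat(kept + [h], dim=1)) + x
```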


Training and Testing

  • Training with the EDSR or BasicSR framework.

    • Download the training data (800 DIV2K + 2650 Flickr2K images).
    • Prepare LR-HR pairs with the BI, BD, and DN degradation models (a data-preparation sketch follows this list).
  • Testing with five commonly used benchmark datasets.

    • Set5: Bevilacqua et al. BMVC 2012.
    • Set14: Zeyde et al. LNCS 2010.
    • B100: Martin et al. ICCV 2001.
    • Urban100: Huang et al. CVPR 2015.
    • Manga109: Matsui et al. MTA 2017.
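
For the BI setting, here is a minimal sketch of generating bicubic LR counterparts for the downloaded HR images. The directory names and the x4 scale are placeholders; note that SR papers conventionally produce BI data with MATLAB's imresize, whose bicubic kernel differs slightly from PIL's, and the BD/DN settings need extra blur/noise steps not shown here.

```python
# Hedged sketch of bicubic (BI) LR-HR pair generation; paths are placeholders.
import os
from PIL import Image

def make_lr_pairs(hr_dir: str, lr_dir: str, scale: int = 4) -> None:
    os.makedirs(lr_dir, exist_ok=True)
    for name in sorted(os.listdir(hr_dir)):
        hr = Image.open(os.path.join(hr_dir, name)).convert("RGB")
        # Crop so height and width divide evenly by the scale factor.
        w, h = hr.size
        hr = hr.crop((0, 0, w - w % scale, h - h % scale))
        lr = hr.resize((hr.width // scale, hr.height // scale), Image.BICUBIC)
        lr.save(os.path.join(lr_dir, name))

# Example call, assuming the standard DIV2K directory layout:
make_lr_pairs("DIV2K/DIV2K_train_HR", "DIV2K/DIV2K_train_LR_bicubic/X4", scale=4)
```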

Results and Models

Acknowledgement

Our RDU builds on existing dynamic convolution and reparameterization methods; thanks to the authors of those works for their enlightening contributions!

Citation

@ARTICLE{9939085,
  author={Wang, Yan and Su, Tongtong and Li, Yusen and Cao, Jiuwen and Wang, Gang and Liu, Xiaoguang},
  journal={IEEE Transactions on Multimedia}, 
  title={DDistill-SR: Reparameterized Dynamic Distillation Network for Lightweight Image Super-Resolution}, 
  year={2022},
  pages={1-13},
  doi={10.1109/TMM.2022.3219646}}

License

Apache License 2.0


Languages

Python 96.2%, MATLAB 3.8%