CSNet

This is a reimplementation of CSNet [1] for block-based compressive sensing reconstruction. CSNet is implemented in MatConvNet. This implementation is motivated by the DnCNN implementation [2].
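
For readers unfamiliar with the setup, below is a minimal MATLAB sketch of classical block-based compressive sampling. A random Gaussian measurement matrix stands in for CSNet's learned convolutional sampling layer, and the image name, block size, and subrate are only example values, not the settings used in this repository.

```matlab
% Minimal sketch of block-based compressive sampling (illustrative only).
% CSNet learns its sampling operator as a convolution; the random Gaussian
% matrix Phi below is just a stand-in to show the block-wise measurement.
blkSize = 32;                               % block size B (example value)
subrate = 0.1;                              % measurement rate M / B^2
nMeas   = round(subrate * blkSize^2);       % measurements per block

img = im2double(imread('cameraman.tif'));   % any grayscale image whose size
[h, w] = size(img);                         % is a multiple of blkSize

Phi  = randn(nMeas, blkSize^2) / sqrt(nMeas);        % random sampling matrix
meas = zeros(nMeas, (h / blkSize) * (w / blkSize));  % one column per block

k = 1;
for i = 1:blkSize:h
    for j = 1:blkSize:w
        blk = img(i:i+blkSize-1, j:j+blkSize-1);
        meas(:, k) = Phi * blk(:);           % y = Phi * vec(block)
        k = k + 1;
    end
end
```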

Update 2019/06

The code has been fixed and now produces better quality than the original CSNet (e.g. 32.40 dB PSNR at subrate 0.1 on Set14). The main reason is that the authors convert RGB images to YCbCr and take the Y channel, whereas the previous implementation used rgb2gray for the conversion.
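
A minimal sketch of the Y-channel extraction described above; the file name is only a placeholder, and the old rgb2gray call is shown commented out for comparison.

```matlab
% Evaluation/training now uses the luminance (Y) channel of a YCbCr
% conversion instead of rgb2gray; the two give slightly different
% values, which changes the reported PSNR.
rgb   = imread('some_rgb_image.png');   % placeholder path
ycbcr = rgb2ycbcr(rgb);
yChan = ycbcr(:, :, 1);                 % channel used for evaluation

% previous (now replaced) conversion:
% yOld = rgb2gray(rgb);
```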

CSNet_v03 contains the up-to-date implementation.

Current Performance (PSNR in dB / SSIM)

| Image  | Rate | GSR PSNR | GSR SSIM | CSNet [1] PSNR | CSNet [1] SSIM | ReImp. PSNR | ReImp. SSIM | Best PSNR | Best SSIM |
|--------|------|----------|----------|----------------|----------------|-------------|-------------|-----------|-----------|
| baby   | 0.1  | 32.18    | 0.8832   | 34.83          | 0.9170         | 33.36       | 0.902       | 33.75     | 0.907     |
| bird   | 0.1  | 34.47    | 0.9411   | 35.15          | 0.9476         | 33.05       | 0.931       | 34.47     | 0.949     |
| butter | 0.1  | 23.78    | 0.8279   | 28.01          | 0.9018         | 25.71       | 0.859       | 27.53     | 0.914     |
| Avg.   |      | 30.14    | 0.8841   | 32.66          | 0.9221         | 30.71       | 0.897       | 31.91     | 0.923     |

How to run

To train CSNet from scratch:

  1. Run 'GenerateTrainingPatches.m' first. It creates the training data outside of this CSNet folder (to stay under GitHub's 100 MB limit).

  2. Run 'TrainingCode/CSNet_v03/Demo_Train.m'. Training data is saved in "data/CSNet_rblk<block_size>mBat<no_mini_batch_size>" (see the sketch after this list for how the folder name is assembled).
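
A small sketch of how that training-data folder name is put together; the block size and mini-batch size below are example values only, so check 'GenerateTrainingPatches.m' and 'Demo_Train.m' for the actual settings and exact naming.

```matlab
% Example values only; the real block size and mini-batch size are
% defined in the training scripts.
blockSize     = 32;
miniBatchSize = 64;

% folder of the form data/CSNet_rblk<block_size>mBat<no_mini_batch_size>
dataFolder = sprintf('data/CSNet_rblk%dmBat%d', blockSize, miniBatchSize);
% e.g. 'data/CSNet_rblk32mBat64'
```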

Disclaimer

Because some parameters are not mentioned in [1], I have tried my best to reproduce the reported results by evaluating several parameter settings. However, the re-implementation results (PSNR, dB) are still 1-2 dB lower than the reported ones.

If you find a better configuration, or have any suggestions, feel free to let me know.

Reference

[1] W. Shi et al., "Deep network for compressed image sensing," IEEE International Conference on Multimedia and Expo (ICME), Jul. 2017.

[2] K. Zhang et al., "Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising," available at https://github.com/cszn/DnCNN
