From Patch to Pixel: A Transformer-based Hierarchical Framework for Compressive Image Sensing

TCS-Net

This repository contains the PyTorch code for the paper "From Patch to Pixel: A Transformer-based Hierarchical Framework for Compressive Image Sensing".

1. Introduction

1) Datasets

Training set: BSDS500, testing sets: McM18, LIVE29, General100 and OST300.

2) Project structure

(TCS-Net)
|-dataset
|    |-train  
|        |-BSDS500 (.jpg)  
|    |-test  
|        |-McM18  
|        |-LIVE29  
|        |-General100  
|        |-OST300  
|-reconstructed_images
|    |-McM18
|        |-grey
|            |-... (Testing results .png)
|        |-rgb
|            |-... (Testing results .png)
|    |-... (Testing sets)
|    |-Res_(...).txt
|-models
|    |-__init__.py  
|    |-net.py  
|    |-modules.py  
|-trained_models  
|    |-1  
|    |-4  
|    |-... (Sampling rates)
|-config 
|    |-__init__.py  
|    |-config.py  
|    |-loader.py  
|-test.py  
|-train.py
|-train.sh
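
The following is a minimal sketch (not part of the repository) that checks whether your local dataset folders match the layout above before you launch training or testing; the paths are taken directly from the tree, so adjust them if you reorganize the data.

import os

# Sketch only: verify the dataset layout described in "Project structure".
expected_dirs = [
    "dataset/train/BSDS500",
    "dataset/test/McM18",
    "dataset/test/LIVE29",
    "dataset/test/General100",
    "dataset/test/OST300",
]

for path in expected_dirs:
    print(f"{path}: {'ok' if os.path.isdir(path) else 'MISSING'}")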

3) Competing methods

Method | Source | Year
ReconNet | Conf. Comput. Vis. Pattern Recog. | 2016
LDIT | Proc. Adv. Neural Inf. Process. Syst. | 2017
LDAMP | Proc. Adv. Neural Inf. Process. Syst. | 2017
ISTA-Net+ | Conf. Comput. Vis. Pattern Recog. | 2018
CSGAN | Proc. Int. Conf. Mach. Learn. | 2019
CSNet+ | Trans. Image Process. | 2020
AMP-Net | Trans. Image Process. | 2021
CSformer | arXiv | 2022

4) Performance demonstration

Visual comparisons of reconstructed images (original images are drawn from the LIVE29 dataset).

2. Usage

1) Re-training TCS-Net.

  • Put the BSDS500 and VOC2012 images into ./dataset/train/.
  • For example, to train TCS-Net at sampling rate τ = 0.1 on GPU No. 0, run the following command. The training set will be packaged automatically and the model will be trained with its default parameters (please make sure you have enough GPU RAM):
python train.py --rate 0.1 --GPU 0
  • You can also run our shell script directly; it automatically trains the model under all sampling rates, i.e., τ ∈ {0.01, 0.04, 0.1, 0.25} (a Python equivalent is sketched after this list):
sh train.sh
  • The trained models (.pth) will be saved in the trained_models folder.
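
As referenced above, if you prefer to launch the runs from Python rather than the shell script, the sketch below (an assumption, not part of the repository) loops train.py over the same sampling rates using only the --rate and --GPU flags documented here; run it from the repository root.

import subprocess

# Sketch only: sequentially run train.py for every sampling rate listed in this README.
SAMPLING_RATES = [0.01, 0.04, 0.1, 0.25]
GPU_ID = 0  # adjust to the GPU you want to use

for rate in SAMPLING_RATES:
    cmd = ["python", "train.py", "--rate", str(rate), "--GPU", str(GPU_ID)]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)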

2) Testing TCS-Net.

  • We provide the trained models: put them under TCS-Net/trained_models/ to use them for testing directly. All trained TCS-Net models can be found at this GoogleDrive link. Please note that the folder names are 100 times the sampling rates, e.g., the folder named 10 contains the models trained at sampling rate 0.1.

  • Put the testing folders into ./dataset/test/.

  • For example, to test TCS-Net at sampling rate τ = 0.1 on GPU No. 0, run:

python test.py --rate 0.1 --GPU 0
  • After that, the reconstructed images and the PSNR/SSIM results will be saved to ./reconstructed_images/ (a sketch for recomputing these metrics follows).
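
If you want to double-check the reported numbers, the sketch below recomputes PSNR and SSIM for a single grey-scale reconstruction with scikit-image; the file names are placeholders (the actual output names depend on the testing set), and NumPy, Pillow, and scikit-image are assumed to be installed.

import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Sketch only: compare one reconstructed image with its original (placeholder file names).
original = np.array(Image.open("dataset/test/McM18/example.png").convert("L"))
recon = np.array(Image.open("reconstructed_images/McM18/grey/example.png").convert("L"))

print(f"PSNR: {peak_signal_noise_ratio(original, recon, data_range=255):.2f} dB")
print(f"SSIM: {structural_similarity(original, recon, data_range=255):.4f}")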

End

Thank you for reading. For more details about TCS-Net, please refer to our paper.


License: MIT

