IGDFormer-light-up-dark

PyTorch implementation

Illumination-guided dual-attention Transformer for low-light image enhancement

Central South University

Model

Overall Architecture

[Figure: overall architecture]

Getting Started

Environment

  1. Clone this repo:
git clone https://github.com/YanJieWen/IGDFormer-light-up-dark.git
cd IGDFormer-light-up-dark/

Data Preparation

  1. Download the dataset: [Retinexformer]
  2. Create a new data root and place the downloaded datasets under parired_datasets (a minimal loader sketch is shown below).
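The exact folder structure depends on the dataset you download; the sketch below is a minimal PyTorch paired-dataset loader assuming parired_datasets contains low/ and high/ sub-folders with identically named images, which may differ from the repository's actual data loader.

# Minimal paired low-/normal-light dataset sketch.
# Assumed layout: parired_datasets/low/* and parired_datasets/high/* (not necessarily the repo's layout).
import os
from glob import glob

from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms


class PairedLowLightDataset(Dataset):
    def __init__(self, root='parired_datasets', low_dir='low', high_dir='high'):
        self.low_paths = sorted(glob(os.path.join(root, low_dir, '*')))
        self.high_paths = sorted(glob(os.path.join(root, high_dir, '*')))
        assert len(self.low_paths) == len(self.high_paths), 'low/high image counts must match'
        self.to_tensor = transforms.ToTensor()

    def __len__(self):
        return len(self.low_paths)

    def __getitem__(self, idx):
        # Matching indices are assumed to correspond to the same scene under low and normal light.
        low = self.to_tensor(Image.open(self.low_paths[idx]).convert('RGB'))
        high = self.to_tensor(Image.open(self.high_paths[idx]).convert('RGB'))
        return low, high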

Training

  1. For single-GPU training: run [train] and adjust the parameters to fit your device.
  2. For multi-GPU training on a single machine: run accelerate launch --multi_gpu train_engine.py
  3. For more details, please check [HUGGING FACE]; a minimal Accelerate training-loop sketch is shown below.
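To make the accelerate launch command above concrete, here is a minimal sketch of how a Hugging Face Accelerate training loop is usually structured. It is not the repository's train_engine.py: the model, data, loss, and hyper-parameters are placeholders.

# Minimal Accelerate training-loop sketch (placeholder model/data, not the repo's train_engine.py).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()                      # handles device placement and distributed setup
model = nn.Conv2d(3, 3, 3, padding=1)            # placeholder standing in for the enhancement model
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)
data = TensorDataset(torch.rand(8, 3, 64, 64), torch.rand(8, 3, 64, 64))  # fake low/normal-light pairs
loader = DataLoader(data, batch_size=2, shuffle=True)
criterion = nn.L1Loss()

# prepare() wraps model, optimizer, and dataloader for whatever configuration accelerate launch selected
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

model.train()
for epoch in range(2):
    for low, high in loader:
        optimizer.zero_grad()
        loss = criterion(model(low), high)
        accelerator.backward(loss)               # replaces loss.backward() under distributed training
        optimizer.step()

Running the same script directly gives single-GPU training, while accelerate launch --multi_gpu replicates the loop across the available GPUs without code changes.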

Evaluation and demo

Make sure you have downloaded the pretrained weights from Google and put them into the save_weights root, then run [validation] or [demo]. A rough single-image demo sketch is shown below.
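As a rough illustration of the demo step only: the sketch below loads a checkpoint from save_weights, enhances one low-light image, and saves the result. The model import, checkpoint file name, and image paths are assumptions, so refer to the repository's [demo] script for the actual entry point.

# Hedged single-image inference sketch; names marked "assumed" are hypothetical, not the repo's real files.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.utils import save_image

from model import IGDFormer                      # assumed import path for the model class

model = IGDFormer()                              # assumed constructor with default arguments
state = torch.load('save_weights/igdformer.pth', map_location='cpu')  # assumed checkpoint name
model.load_state_dict(state)
model.eval()

low = transforms.ToTensor()(Image.open('low_light.png').convert('RGB')).unsqueeze(0)  # any low-light image
with torch.no_grad():
    enhanced = model(low).clamp(0, 1)            # enhanced output clipped to [0, 1]
save_image(enhanced, 'enhanced.png')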

Acknowledgement

Many thanks to the excellent works that we build upon:

  1. BasicSR
  2. Retinexformer
  3. Restormer
  4. Huggingface


License: MIT License

