HazeRemovalList

A list of papers on the haze removal (image dehazing) task and the corresponding benchmark datasets. If you find a paper or dataset missing, please feel free to open an issue and I will add it to the list. Thank you for your attention.

Contents

Papers

Datasets

Papers

2021

[Paper, Code] Yi, Xin, et al. "Two-Step Image Dehazing with Intra-domain and Inter-domain Adaption." arXiv preprint arXiv:2102.03501 (2021).

[Paper, Code] Liu, Huan, Chen Wang, and Jun Chen. "Indirect Domain Shift for Single Image Dehazing." arXiv preprint arXiv:2102.03268 (2021).

[Paper, Code] Shyam, Pranjay, Kuk-Jin Yoon, and Kyung-Soo Kim. "Towards Domain Invariant Single Image Dehazing." arXiv preprint arXiv:2101.10449 (2021).

2020

[Paper, Code] Ren, Wenqi, et al. "Single image dehazing via multi-scale convolutional neural networks with holistic edges." International Journal of Computer Vision 128.1 (2020): 240-259.

[Paper, Code] Li, Ruoteng, et al. "Learning to Dehaze From Realistic Scene with A Fast Physics Based Dehazing Network." arXiv preprint arXiv:2004.08554 (2020).

[Paper, Code] Das, Sourya Dipta, and Saikat Dutta. "Fast deep multi-patch hierarchical network for nonhomogeneous image dehazing." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2020.

[Paper, Code] Li, Boyun, et al. "You only look yourself: Unsupervised and untrained single image dehazing neural network." International Journal of Computer Vision (2021): 1-14.

[Paper, Code] Liu, Chen, Jiaqi Fan, and Guosheng Yin. "Efficient Unpaired Image Dehazing with Cyclic Perceptual-Depth Supervision." arXiv preprint arXiv:2007.05220 (2020).

[Paper, Code] Shen, Jiawei, et al. "Implicit Euler ODE Networks for Single-Image Dehazing." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2020.

[Paper, Code] Zhang, Jing, et al. "Nighttime dehazing with a synthetic benchmark." Proceedings of the 28th ACM International Conference on Multimedia. 2020.

[Paper, Code] Anvari, Zahra, and Vassilis Athitsos. "Dehaze-GLCGAN: Unpaired Single Image De-hazing via Adversarial Training." arXiv e-prints (2020).

[Paper, Code] Dhara, Sobhan Kanti, et al. "Color cast dependent image dehazing via adaptive airlight refinement and non-linear color balancing." IEEE Transactions on Circuits and Systems for Video Technology (2020).

[Paper, Code] Singh, Ayush, Ajay Bhave, and Dilip K. Prasad. "Single image dehazing for a variety of haze scenarios using back projected pyramid network." European Conference on Computer Vision. Springer, Cham, 2020.

[Paper, Code] Shao, Yuanjie, et al. "Domain adaptation for image dehazing." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020.

[Paper, Code] Dudhane, Akshay, et al. "Varicolored image de-hazing." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020.

[Paper, Code] Pang, Yanwei, et al. "BidNet: Binocular image dehazing without explicit disparity estimation." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020.

[Paper, Code] Mehta, Aditya, et al. "Hidegan: A hyperspectral-guided image dehazing gan." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2020.

[Paper, Code] Hong, Ming, et al. "Distilling Image Dehazing With Heterogeneous Task Imitation." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020.

[Paper, Code] Dong, Hang, et al. "Multi-scale boosted dehazing network with dense feature fusion." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020.

[Paper, Code] Dong, Yu, et al. "FD-GAN: Generative adversarial networks with fusion-discriminator for single image dehazing." Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 34. No. 07. 2020.

[Paper, Code] Wu, Haiyan, et al. "Knowledge transfer dehazing network for nonhomogeneous dehazing." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2020.

[Paper, Code] Deng, Qili, et al. "HardGAN: A Haze-Aware Representation Distillation GAN for Single Image Dehazing." European Conference on Computer Vision. Springer, Cham, 2020.

[Paper, Code] Qin, Xu, et al. "Ffa-net: Feature fusion attention network for single image dehazing." Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 34. No. 07. 2020.

[Paper, Code] Liu, Jing, et al. "Trident dehazing network." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2020.

[Paper, Code] Dong, Jiangxin, and Jinshan Pan. "Physics-Based Feature Dehazing Networks." European Conference on Computer Vision. Springer, Cham, 2020.

[Paper, Code] Kar, Aupendu, et al. "Transmission Map and Atmospheric Light Guided Iterative Updater Network for Single Image Dehazing." arXiv preprint arXiv:2008.01701 (2020).

2019

[Paper, Code] Chen, Dongdong, et al. "Gated context aggregation network for image dehazing and deraining." 2019 IEEE winter conference on applications of computer vision (WACV). IEEE, 2019.

[Paper, Code] Chen, Wei-Ting, Jian-Jiun Ding, and Sy-Yen Kuo. "PMS-net: Robust haze removal based on patch map for single images." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2019.

[Paper, Code] Qu, Yanyun, et al. "Enhanced pix2pix dehazing network." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2019.

[Paper, Code] Liu, Xiaohong, et al. "Griddehazenet: Attention-based multi-scale network for image dehazing." Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019.

[Paper, Code] Li, Yunan, et al. "LAP-net: Level-aware progressive network for image dehazing." Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019.

[Paper, Code] Liu, Yang, et al. "Learning deep priors for image dehazing." Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019.

[Paper, Code] Deng, Zijun, et al. "Deep multi-model fusion for single-image dehazing." Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019.

2018

[Paper, Code] Cheng, Ziang, et al. "Semantic single-image dehazing." arXiv preprint arXiv:1804.05624 (2018).

[Paper, Code] Ren, Wenqi, et al. "Gated fusion network for single image dehazing." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018.

[Paper, Code] Li, Runde, et al. "Single image dehazing via conditional generative adversarial network." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018.

[Paper, Code] Cho, Younggun, Jinyong Jeong, and Ayoung Kim. "Model-assisted multiband fusion for single image enhancement and applications to robot vision." IEEE Robotics and Automation Letters 3.4 (2018): 2822-2829.

[Paper, Code] Zhang, He, and Vishal M. Patel. "Densely connected pyramid dehazing network." Proceedings of the IEEE conference on computer vision and pattern recognition. 2018.

Before 2018

[Paper, Code] Song, Yafei, et al. "Single image dehazing using ranking convolutional neural network." IEEE Transactions on Multimedia 20.6 (2017): 1548-1560.

[Paper, Code] Li, Boyi, et al. "Aod-net: All-in-one dehazing network." Proceedings of the IEEE international conference on computer vision. 2017.

Datasets

RESIDE

[Paper, Link] RESIDE (REalistic Single Image DEhazing) is a large-scale benchmark consisting of both synthetic and real-world hazy images. It highlights diverse data sources and image contents, and is divided into five subsets (ITS, OTS, SOTS, RTTS, HSTS), each serving a different training or evaluation purpose; a minimal loading sketch follows the subset list below.

There are three versions of RESIDE: RESIDE-V0, RESIDE-Standard, RESIDE-beta.

ITS (Indoor Training Set): synthetic data.

OTS (Outdoor Training Set): synthetic data.

SOTS (Synthetic Objective Testing Set): synthetic data.

HSTS (Hybrid Subjective Testing Set): real data.

RTTS (Real-world Task-driven Testing Set): real data.
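
For orientation, here is a minimal sketch of how paired hazy/clear images from a RESIDE-style training split (e.g. ITS or OTS) could be loaded with PyTorch. The directory layout and the hazy file-naming scheme ("<clear_id>_<suffix>.png") are assumptions for illustration, not the official RESIDE structure.

```python
# Minimal sketch of a paired hazy/clear loader for a RESIDE-style split.
# Assumption: hazy images live in <root>/hazy and clear images in <root>/clear,
# and each hazy file is named "<clear_id>_<suffix>.png" (illustrative layout,
# not the official RESIDE structure).
from pathlib import Path

from PIL import Image
from torch.utils.data import Dataset


class PairedHazeDataset(Dataset):
    def __init__(self, root, transform=None):
        self.hazy_paths = sorted(Path(root, "hazy").glob("*.png"))
        self.clear_dir = Path(root, "clear")
        self.transform = transform

    def __len__(self):
        return len(self.hazy_paths)

    def __getitem__(self, idx):
        hazy_path = self.hazy_paths[idx]
        # Recover the ground-truth id from the hazy file name (assumed scheme).
        clear_id = hazy_path.stem.split("_")[0]
        clear_path = self.clear_dir / f"{clear_id}.png"
        hazy = Image.open(hazy_path).convert("RGB")
        clear = Image.open(clear_path).convert("RGB")
        if self.transform is not None:
            hazy, clear = self.transform(hazy), self.transform(clear)
        return hazy, clear
```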

I-HAZE & O-HAZE (datasets for the NTIRE 2018 dehazing competition)

[Paper, Link] I-HAZE (real data): a dataset that contains 35 image pairs of hazy and corresponding haze-free (ground-truth) indoor images. Different from most of the existing dehazing databases, hazy images have been generated using real haze produced by a professional haze machine. To ease color calibration and improve the assessment of dehazing algorithms, each scene includes a MacBeth color checker. Moreover, since the images are captured in a controlled environment, both haze-free and hazy images are captured under the same illumination conditions.

[Paper, Link] O-HAZE (real data): the first outdoor dehazing database composed of pairs of real hazy and corresponding haze-free images. The hazy images were captured in the presence of real haze generated by professional haze machines. O-HAZE contains 45 different outdoor scenes, each depicting the same visual content recorded in haze-free and hazy conditions under the same illumination parameters.

Dense-Haze (dataset for NTIRE 2019 dehazing competition)

[Paper, Link] Dense-Haze (real data): characterized by dense and homogeneous hazy scenes, Dense-Haze contains 33 pairs of real hazy and corresponding haze-free images of various outdoor scenes. The hazy scenes were recorded by introducing real haze generated by professional haze machines, and each hazy/haze-free pair shows the same visual content captured under the same illumination parameters. The dataset aims to push the state of the art in single-image dehazing by promoting methods that are robust to real, dense haze.

NH-HAZE (dataset for NTIRE 2020 dehazing competition)

[Paper, Link] NH-HAZE (real data): a realistic non-homogeneous dataset with pairs of real hazy and corresponding haze-free images. It is the first non-homogeneous image dehazing dataset and contains 55 outdoor scenes. The non-homogeneous haze was introduced into the scenes using a professional haze generator that imitates real hazy conditions.

Dataset for NTIRE 2021 Dehazing Competition

[Paper, Link] NTIRE 2021 (real data): the paired hazy/haze-free image set used in the NTIRE 2021 dehazing challenge.

D-Hazy

[Paper, Link] D-Hazy (synthetic data): D-Hazy is built on the Middlebury and NYU Depth datasets, which provide images of various scenes together with their corresponding depth maps.
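
Synthetic sets like D-Hazy are commonly generated from a clear image and its depth map via the standard atmospheric scattering model I(x) = J(x) t(x) + A (1 - t(x)), with t(x) = exp(-beta d(x)). Below is a minimal sketch of that synthesis step; the beta and airlight values are illustrative, not D-Hazy's exact settings.

```python
# Sketch: synthesize a hazy image from a clear image and its depth map using the
# standard atmospheric scattering model I = J * t + A * (1 - t), t = exp(-beta * d).
# beta and airlight below are illustrative values, not the settings used by D-Hazy.
import numpy as np


def synthesize_haze(clear, depth, beta=1.0, airlight=0.8):
    # clear: HxWx3 float array in [0, 1]; depth: HxW float array (e.g. in meters).
    transmission = np.exp(-beta * depth)[..., None]  # t(x) = exp(-beta * d(x))
    hazy = clear * transmission + airlight * (1.0 - transmission)
    return np.clip(hazy, 0.0, 1.0)
```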

BeDDE

[Paper, Link] BeDDE (real data): BeDDE (read as /ˈbedi/) is a real-world benchmark dataset for evaluating dehazing methods. It consists of 208 pairs of hazy images and clear reference images. For each pair, a manually labelled mask is provided to delineate regions with the same contents.
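
As a rough illustration of how the per-pair masks could be used, the sketch below computes PSNR restricted to the labelled region. PSNR is only a stand-in here, not necessarily BeDDE's official evaluation metric, and the array formats are assumptions.

```python
# Sketch: compare a dehazed result against the BeDDE reference only inside the
# provided mask. Assumes images are float arrays in [0, 1] and the mask is a
# binary array of the same spatial size (layout/encoding are assumptions).
import numpy as np


def masked_psnr(dehazed, reference, mask, max_val=1.0):
    mask = mask.astype(bool)
    # Mean squared error restricted to the labelled region.
    mse = np.mean((dehazed[mask] - reference[mask]) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val**2 / mse)
```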

exBeDDE

[Paper, Link] exBeDDE (real data): exBeDDE is an extension of BeDDE designed to measure the performance of dehazing evaluation metrics. It contains 167 hazy images and 1670 dehazed images with mean opinion scores (MOS) labelled by human subjects. Its hazy images come from BeDDE, and the dehazed images are generated by 10 dehazing methods.
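
A minimal sketch of how a candidate dehazing quality metric could be checked against the exBeDDE mean opinion scores via Spearman rank correlation; the input format (parallel per-image score lists) is an assumption.

```python
# Sketch: rank correlation between a candidate quality metric and human MOS
# on exBeDDE. `metric_scores` and `mos_scores` are assumed to be parallel
# sequences with one entry per dehazed image (format is an assumption).
from scipy.stats import spearmanr


def metric_vs_mos(metric_scores, mos_scores):
    rho, p_value = spearmanr(metric_scores, mos_scores)
    return rho, p_value
```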

Fattal

[Paper, Link] Fattal: the hazy image set released by Raanan Fattal alongside his dehazing work, commonly used for qualitative comparison.

About

list of dehazing papers: https://yiqunchen1999.github.io/HazeRemovalList/

License: MIT