Joint Intensity-Gradient Guided Generative Modeling for Colorization

JGM

Paper: Joint Intensity-Gradient Guided Generative Modeling for Colorization

Authors: Kai Hong, Jin Li, Wanyun Li, Cailian Yang, Minghui Zhang, Yuhao Wang, Qiegen Liu

Date: 12/2020  Version: 1.0
The code and the algorithm are for non-commercial use only. Copyright 2020, Department of Electronic Information Engineering, Nanchang University.

This paper proposes an iterative generative model for solving the automatic colorization problem. Although previous research has shown the capability to generate plausible colors, edge color overflow and the requirement of reference images still exist. The starting point of the unsupervised learning in this study is the observation that the gradient map possesses latent information of the image. Therefore, the inference process of the generative modeling is conducted in the joint intensity-gradient domain. Specifically, a set of intensity-gradient formed high-dimensional tensors, used as the network input, trains a powerful noise conditional score network at the training phase. Furthermore, a joint intensity-gradient constraint in the data-fidelity term is proposed to limit the degree of freedom within the generative model at the iterative colorization stage, which is conducive to edge preservation. Extensive experiments demonstrate that the system outperforms state-of-the-art methods both in quantitative comparisons and in a user study.
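As a rough illustration of the 9-channel intensity-gradient input described above, the sketch below stacks an RGB image with its per-channel horizontal and vertical differences. The exact gradient operator, channel ordering, and normalization used in this repository are assumptions here, not taken from the code.

```python
import numpy as np

def to_intensity_gradient_tensor(rgb):
    """Stack an RGB image with its per-channel horizontal and vertical
    gradients into a 9-channel tensor (3 intensity + 3 dx + 3 dy).

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns an array of shape (H, W, 9).
    """
    # Forward differences as a simple gradient operator (an assumption,
    # not necessarily the operator used in the paper).
    dx = np.diff(rgb, axis=1, append=rgb[:, -1:, :])
    dy = np.diff(rgb, axis=0, append=rgb[-1:, :, :])
    return np.concatenate([rgb, dx, dy], axis=-1)
```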

Training

To train the model, run:

python3.5 JGM_main.py --jgm Train_9ch --config anneal.yml --doc <your save path>

Test

To test with a trained model, run:

python3.5 JGM_main.py --igm Test_9ch --config anneal.yml --doc <your checkpoint> --test --image_folder <your save path>
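For example, assuming an experiment name of `bedroom_9ch` and an output folder `results_bedroom` (both illustrative names, not shipped with this repository), a train-then-test run could look like:

```bash
# Train the 9-channel score network; checkpoints are logged under bedroom_9ch.
python3.5 JGM_main.py --jgm Train_9ch --config anneal.yml --doc bedroom_9ch

# Colorize with the trained checkpoint and write results to results_bedroom.
python3.5 JGM_main.py --igm Test_9ch --config anneal.yml --doc bedroom_9ch --test --image_folder results_bedroom
```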

Checkpoints

We provide pretrained checkpoints. You can download the pretrained models from Baidu Drive; the extraction code is "JGM9".

Graphical representation

Visualization of the colorization results with different generative modelings.

Visualization of the colorization results with different generative modelings. (a) The reference grayscale image. (b) The top row is the colorization result with generative modeling in the intensity domain, and the bottom row is with generative modeling in the joint intensity-gradient domain. (c) The colorization result of the proposed JGM. In particular, the generative modeling of JGM is conducted in the 9-channel (intensity-gradient) domain, which largely reduces the color ambiguity in intensity and attains a more natural and realistic result.

The visual comparison of CVAE, iGM and JGM

Visual comparison of CVAE (a), iGM (b) and JGM (c). Unlike CVAE, which optimally encodes a compressible latent space to achieve the colorization goal, both iGM and JGM utilize generative modeling in a high-dimensional space to optimize the colorization target. In particular, by taking advantage of the joint intensity-gradient field, the proposed JGM learns prior information and iteratively approaches the color image.

The pipeline of the prior learning stage and the iterative colorization procedure of JGM.

The pipeline of the prior learning stage and the iterative colorization procedure of JGM. More specifically, the prior training stage learns the data distribution (covering both the image domain and the gradient domain) from the reference dataset, which acts as prior information for the later colorization. The colorization stage generates samples from the high-dimensional noisy data distribution by annealed Langevin dynamics, under the given intensity-gradient data-consistency constraint.
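To make the sampling stage concrete, below is a minimal sketch of annealed Langevin dynamics with a data-consistency projection. Here `score_fn` and `consistency_fn` are placeholders for the trained noise conditional score network and the joint intensity-gradient constraint, and the step-size rule is an NCSN-style assumption rather than this repository's exact implementation.

```python
import numpy as np

def annealed_langevin_colorize(score_fn, consistency_fn, x, sigmas,
                               steps_per_level=100, eps=2e-5):
    """Toy annealed Langevin sampler with a data-consistency projection.

    score_fn(x, sigma):  trained noise conditional score network.
    consistency_fn(x):   projects x onto the set consistent with the given
                         grayscale intensity and its gradients.
    x:                   initial 9-channel sample (e.g. random noise).
    sigmas:              decreasing noise levels (annealing schedule).
    """
    for sigma in sigmas:
        # Smaller Langevin step size at smaller noise levels (NCSN-style rule).
        step = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(steps_per_level):
            noise = np.sqrt(2.0 * step) * np.random.randn(*x.shape)
            x = x + step * score_fn(x, sigma) + noise
            # Enforce fidelity to the observed intensity and its gradients.
            x = consistency_fn(x)
    return x
```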

Visualization of the intermediate colorization process with annealed Langevin dynamics.

Visualization of the intermediate colorization process with annealed Langevin dynamics. As the level of artificial noise becomes smaller, the colorization results tend toward more natural color effects.

Visual comparisons with the state-of-the-arts.

Visual comparisons with the state-of-the-art methods. From left to right: Grayscale, Zhang et al., MemoPainter, ChromaGAN, iGM and JGM. Our method, which operates in the gradient domain and a high-dimensional space, predicts visually pleasing colors.

Diversified colorization.

Diversified colorization effects of the proposed JGM.

Train Data

We choose three datasets for the experiments: LSUN (bedroom and church), COCO-stuff and ImageNet.

Test Data

We randomly select 100 images each from the bedroom and church categories; each image is 128x128.
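A minimal sketch of how a test image could be brought to 128x128 (center crop followed by resize, using PIL); the actual preprocessing used in this repository may differ.

```python
from PIL import Image

def load_test_image(path, size=128):
    """Center-crop an image to a square and resize it to size x size."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    s = min(w, h)
    # Crop the central square, then resize to the target resolution.
    box = ((w - s) // 2, (h - s) // 2, (w - s) // 2 + s, (h - s) // 2 + s)
    return img.crop(box).resize((size, size), Image.BICUBIC)
```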

Other Related Projects
