If you want to see the original PyTorch Fast-Neural-Style example, this is the original repo.
The fast_neural_style example in the original PyTorch examples repository implements Perceptual Losses for Real-Time Style Transfer and Super-Resolution. However, it is meant to be run from the CLI and takes its arguments through argparse, which makes modifying arguments inconvenient and hurts usability. I therefore refactored the code into a Python package and added a `HyperParameter` class that receives the arguments.
This repository contains a PyTorch implementation of an algorithm for artistic style transfer. The algorithm can be used to mix the content of an image with the style of another image. For example, here is a photograph of a door arch rendered in the style of a stained glass painting.
The model uses the method described in Perceptual Losses for Real-Time Style Transfer and Super-Resolution along with Instance Normalization. The saved-models for examples shown in the README can be downloaded from here.
```python
class HyperParameter:
    def __init__(self, command, cuda, param_dict):
        # Code

    def set_train_parameter(self, param_dict):
        # Code

    def set_eval_parameter(self, param_dict):
        # Code
```
- `__init__`: receives which function to run, whether to use the GPU, and a dictionary of arguments.
- `set_train_parameter`: receives `param_dict` when training.
- `set_eval_parameter`: receives `param_dict` when evaluating.

Values that are not set fall back to their defaults. See the `HyperParameter` class in the code for details.
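As a rough illustration, the default-filling behavior described above could look like the sketch below. The field names and default values are assumptions for illustration (except `content_weight`/`style_weight`, which match the weights mentioned later in this README), not the actual implementation in `neural_style.py`.

```python
# Illustrative defaults; the real values live in neural_style.py.
TRAIN_DEFAULTS = {
    "epochs": 2,
    "batch_size": 4,
    "content_weight": 1e5,
    "style_weight": 1e10,
}

class HyperParameter:
    def __init__(self, command, cuda, param_dict):
        self.command = command  # "train" or "eval"
        self.cuda = cuda        # 1 for GPU, 0 for CPU
        if command == "train":
            self.set_train_parameter(param_dict)
        else:
            self.set_eval_parameter(param_dict)

    def set_train_parameter(self, param_dict):
        # Any key missing from param_dict falls back to its default.
        for key, default in TRAIN_DEFAULTS.items():
            setattr(self, key, param_dict.get(key, default))
        # User-supplied keys (dataset, style_image, ...) win over defaults.
        for key, value in param_dict.items():
            setattr(self, key, value)

    def set_eval_parameter(self, param_dict):
        for key, value in param_dict.items():
            setattr(self, key, value)
```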
```python
from ParentDir.core import neural_style

### For Evaluation
param_dict = {
    "content_image": "/path/to/content.jpg",
    "output_image": "/path/to/output.jpg",
    "model": "/path/to/model/checkpoint.pth or mod.model"
}
cuda = 1  # True
param = neural_style.HyperParameter("eval", cuda, param_dict)
neural_style.stylize(param)
```
Point `model` at either a `.pth` checkpoint file obtained during training or the `.model` file produced once training finishes, and the content image will be transformed into the learned style.
```python
from ParentDir.core import neural_style

### For Train
param_dict = {
    # "transfer_learning": 1,
    # "checkpoint_model_dir": "/path/to/checkpoint/",
    "dataset": "/path/to/COCO",
    "style_image": "/path/to/style.jpg",
    "save_model_dir": "/path/to/save/"
}
cuda = 1  # True
param = neural_style.HyperParameter("train", cuda, param_dict)
neural_style.stylize(param)
```
When training, if you want to continue from a previous run, set the `transfer_learning` argument to 1 and specify the checkpoint folder; training will resume from there.
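Concretely, resuming means enabling the two commented-out keys in the training dictionary (paths here are placeholders, as above):

```python
# Resume training: transfer_learning on, checkpoint folder specified.
param_dict = {
    "transfer_learning": 1,
    "checkpoint_model_dir": "/path/to/checkpoint/",
    "dataset": "/path/to/COCO",
    "style_image": "/path/to/style.jpg",
    "save_model_dir": "/path/to/save/"
}
```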
I used COCO 2014 Training images dataset [80K/13GB] (download).
- `--style-image`: path to style-image.
- `--save-model-dir`: path to folder where trained model will be saved.
- `--cuda`: set it to 1 for running on GPU, 0 for CPU.
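For reference, a training invocation of the original CLI script looks roughly like this (paths are placeholders):

```shell
python neural_style/neural_style.py train \
  --dataset /path/to/COCO \
  --style-image /path/to/style.jpg \
  --save-model-dir /path/to/save/ \
  --cuda 1
```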
Refer to `neural_style/neural_style.py` for other command line arguments. For training new models you might have to tune the values of `--content-weight` and `--style-weight`. The mosaic style model shown above was trained with `--content-weight 1e5` and `--style-weight 1e10`. The remaining 3 models were also trained with a similar order of weight parameters, with slight variation in the `--style-weight` (`5e10` or `1e11`).
The program is written in Python and uses PyTorch and SciPy. A GPU is not necessary, but can provide a significant speed-up, especially for training a new model. Regular-sized images can be styled on a laptop or desktop using saved models.

Models for the examples shown below can be downloaded from here or by running the script `download_saved_models.py`.