# Artistic style transfer for videos
The goal of this project is to use deep learning to perform style transfer on videos.
Style transfer applies the color scheme and texture of an artwork to other images or videos. Style transfer on static images has been widely researched, but work on video style transfer is fairly new. In this project, I applied the method from this paper; the authors' implementation is in Lua with Torch as the backend. I modified the Keras neural style transfer example to incorporate image warping and the temporal constraints. Please refer to my blog for details. Thanks to Manuel Ruder and Somshubra Majumdar for their wonderful GitHub repos.
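For context, image style transfer in the usual Gatys et al. formulation matches the Gram matrices of CNN feature maps between the generated image and the style image. Below is a minimal NumPy sketch of that style loss for a single layer; the function names and feature-map shapes are illustrative, not taken from this repo's code:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map with shape (channels, height, width)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Channel-by-channel correlations, normalized by spatial size.
    return f @ f.T / (h * w)

def style_loss(gen_features, style_features):
    """Mean squared difference between the Gram matrices of the
    generated image's and the style image's feature maps."""
    g_gen = gram_matrix(gen_features)
    g_style = gram_matrix(style_features)
    return float(np.mean((g_gen - g_style) ** 2))
```

In practice this loss is computed on several VGG layers and summed with a content loss and a total-variation term; the video method adds the temporal loss described below.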
Please enjoy the following two videos that I created for this project:
The workflow is as follows:
- Scrape 20k images of paintings and photographs from Flickr and fine-tune the last convolutional layer of VGG16 so that it distinguishes paintings from photographs with 88% accuracy.
- Generate optical flow and per-pixel weights for the temporal constraints using DeepMatching.
- Perform style transfer on each frame, initializing from the warped previous frame and adding a temporal loss.
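To illustrate step 3, here is a hedged NumPy sketch of the two pieces the temporal constraint needs: warping the previous stylized frame by the optical flow, and penalizing deviation from it where the flow is reliable. The function names, nearest-neighbor sampling, and array shapes are my own simplifications (the paper uses bilinear interpolation and a forward/backward flow consistency check to derive the weights):

```python
import numpy as np

def warp(prev_frame, flow):
    """Warp the previous frame toward the current one using a backward
    optical flow field of shape (H, W, 2). Nearest-neighbor sampling
    keeps the sketch short; real implementations interpolate bilinearly."""
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]

def temporal_loss(current, warped_prev, weights):
    """Penalize the current stylized frame (H, W, 3) where it deviates
    from the warped previous stylized frame. `weights` (H, W) is ~1
    where the flow is reliable and 0 in occluded regions, so motion
    boundaries are free to change."""
    diff = (current - warped_prev) ** 2
    return float(np.sum(weights[..., None] * diff) / current.size)
```

The warped previous frame doubles as the initialization for the optimizer on each new frame, which is what keeps the stylization coherent over time.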
In this repo, Jupyter notebooks for steps 1 and 3 are included, along with the shell script and files used in step 2 (you will need additional files from Manuel Ruder's GitHub repo to obtain the optical flows).