
Lapstyle: Laplacian-Steered Neural Style Transfer

Code and test images for the paper "Laplacian-Steered Neural Style Transfer".

Lapstyle extends an existing neural style transfer method with one or more Laplacian loss layers (a minimal sketch of the idea follows the list). The following three neural style transfer implementations have been extended:

* [neural-style](https://github.com/jcjohnson/neural-style) by Justin Johnson (Torch)
* [neural-style](https://github.com/anishathalye/neural-style) by Anish Athalye (TensorFlow)
* [CNNMRF](https://github.com/chuanli11/CNNMRF) by Chuan Li [2]
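The following is not the repository's code, just a minimal NumPy/SciPy sketch of the Laplacian loss idea (the names `laplacian` and `laplacian_loss` are ours): filter two grayscale images with the standard 3x3 discrete Laplacian kernel and penalize the squared difference of the responses, which steers the synthesized image to preserve the content image's edge structure.

```python
import numpy as np
from scipy.ndimage import convolve

# Standard 3x3 discrete Laplacian kernel.
LAPLACIAN_KERNEL = np.array([[0.0,  1.0, 0.0],
                             [1.0, -4.0, 1.0],
                             [0.0,  1.0, 0.0]])

def laplacian(img):
    """Response of a 2-D grayscale image to the discrete Laplacian filter."""
    return convolve(img, LAPLACIAN_KERNEL, mode="nearest")

def laplacian_loss(content, generated):
    """Sum of squared differences between the two images' Laplacians.

    Minimized alongside the usual content and style losses, this term
    encourages the generated image to keep the content image's detail.
    """
    diff = laplacian(content) - laplacian(generated)
    return float(np.sum(diff ** 2))
```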

The implementation by Justin Johnson clearly produces the best images (whether the original neural_style.lua or the extended lap_style.lua), and its content and style losses are also the smallest. Its advantage seems attributable to the L-BFGS optimization, since the algorithm is otherwise identical to Anish Athalye's implementation.
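As context for the optimizer remark above, here is a runnable toy (ours, not taken from either repository) showing how SciPy's L-BFGS-B driver optimizes pixel values given any function returning a loss and its gradient; in real style transfer the toy quadratic below would be the combined content/style/Laplacian loss.

```python
import numpy as np
from scipy.optimize import minimize

target = np.random.rand(64, 64)  # stand-in for the image being matched

def loss_and_grad(x_flat):
    """Toy quadratic loss; returns (loss, gradient) as L-BFGS-B expects."""
    x = x_flat.reshape(target.shape)
    diff = x - target
    return float(np.sum(diff ** 2)), (2.0 * diff).ravel()

x0 = np.random.rand(64, 64).ravel()
res = minimize(loss_and_grad, x0, jac=True, method="L-BFGS-B")
print(res.fun)  # close to 0: the optimizer recovers the target image
```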

Setup:

The setup procedures are the same as those of each original project. The following procedures for lap_style.lua are quoted from https://github.com/jcjohnson/neural-style:

Dependencies:

* [torch7](https://github.com/torch/torch7)
* [loadcaffe](https://github.com/szagoruyko/loadcaffe)

Optional dependencies:

* For CUDA backend: CUDA 6.5+ and [cunn](https://github.com/torch/cunn)
* For cuDNN backend: [cudnn.torch](https://github.com/soumith/cudnn.torch)
* For OpenCL backend: [cltorch](https://github.com/hughperkins/cltorch) and [clnn](https://github.com/hughperkins/clnn)

After installing dependencies, you'll need to run the following script to download the VGG model:

```
sh models/download_models.sh
```

This will download the original VGG-19 model.

Sample usage:

```
th lap_style.lua -style_image images/flowers.png -content_image images/megan.png -output_image output/megan_flowers20_100.png -content_weight 20 -lap_layers 2 -lap_weights 100
```
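The `-lap_layers` and `-lap_weights` flags suggest that several Laplacian loss layers can be combined, each with its own weight. The sketch below is our assumption of how such a combination could work, not lap_style.lua's actual interpretation of the flags: each "layer" average-pools the image by a factor before taking the Laplacian, and the per-scale losses are summed with the given weights.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

# Standard 3x3 discrete Laplacian kernel.
LAPLACIAN_KERNEL = np.array([[0.0,  1.0, 0.0],
                             [1.0, -4.0, 1.0],
                             [0.0,  1.0, 0.0]])

def lap_loss(a, b):
    """Squared difference of the two images' Laplacian responses."""
    d = convolve(a, LAPLACIAN_KERNEL) - convolve(b, LAPLACIAN_KERNEL)
    return float(np.sum(d ** 2))

def multi_scale_lap_loss(content, generated,
                         pool_sizes=(2,), weights=(100.0,)):
    """Weighted sum of Laplacian losses computed at several scales.

    pool_sizes / weights mirror the -lap_layers / -lap_weights flags
    under our assumption; this is illustrative, not the repo's code.
    """
    total = 0.0
    for p, w in zip(pool_sizes, weights):
        # Average-pool by factor p: box-smooth, then subsample.
        c = uniform_filter(content, size=p)[::p, ::p]
        g = uniform_filter(generated, size=p)[::p, ::p]
        total += w * lap_loss(c, g)
    return total
```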

Sample images:



The four images in each group are: 1) the content image, 2) the style image, 3) the image synthesized with the original method of Gatys et al. [1], and 4) the image synthesized with Lapstyle.

Note: although photo-realistic style transfer [3] (https://github.com/luanfujun/deep-photo-styletransfer) performs amazingly well on its own test images, it does not work on the images we tested. It seems that, for it to work well, the content image and the style image have to have highly similar layouts and semantic contents.

Citation

You are welcome to cite the paper (https://arxiv.org/abs/1707.01253) with the following BibTeX:

```
@InProceedings{lapstyle,
  author    = {Shaohua Li and Xinxing Xu and Liqiang Nie and Tat-Seng Chua},
  title     = {Laplacian-Steered Neural Style Transfer},
  booktitle = {Proceedings of the ACM Multimedia Conference (MM)},
  year      = {2017},
}
```

References

[1] Leon A. Gatys, Alexander S. Ecker, and Matthias Bethge. 2016. Image Style Transfer Using Convolutional Neural Networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2414–2423.

[2] Chuan Li and Michael Wand. 2016. Combining Markov Random Fields and Convolutional Neural Networks for Image Synthesis. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2479–2486.

[3] Fujun Luan, Sylvain Paris, Eli Shechtman, and Kavita Bala. 2017. Deep Photo Style Transfer. arXiv preprint arXiv:1703.07511 (2017).
