
Lapstyle: Laplacian-Steered Neural Style Transfer

Code and test images for the paper "Laplacian-Steered Neural Style Transfer".

Lapstyle extends an existing neural style transfer method with one or more Laplacian loss layers. Three existing neural style transfer implementations have been extended.

The implementation by Justin Johnson (either the original neural_style.lua or the extended lap_style.lua) clearly produces the best images, and its content and style losses are also the smallest. Its advantage seems attributable to the L-BFGS optimization, since the algorithm is otherwise identical to Anish Athalye's implementation.
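As a rough illustration of the idea (a minimal NumPy sketch under stated assumptions, not the actual Torch implementation; the paper computes the loss on pooled images, which is omitted here), a Laplacian loss penalizes differences between the Laplacians of the content image and the generated image:

```python
import numpy as np

def laplacian(img):
    # 4-neighbour discrete Laplacian with replicate padding at the borders
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1]
            + p[1:-1, :-2] + p[1:-1, 2:]
            - 4.0 * img)

def laplacian_loss(content, generated):
    # sum of squared differences between the two images' Laplacians;
    # edges present in the content image are encouraged to survive stylization
    d = laplacian(content) - laplacian(generated)
    return float(np.sum(d ** 2))
```

A flat region has zero Laplacian, so the loss only constrains the generated image where the content image has edges and fine detail.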


The setup procedure is the same as for each original project. The following instructions for lap_style.lua are quoted from the original project's README.

Dependencies:

* torch7
* loadcaffe

Optional dependencies:

After installing dependencies, you'll need to run the following script to download the VGG model:

sh models/

This will download the original VGG-19 model.

Sample usage:

th lap_style.lua -style_image images/flowers.png -content_image images/megan.png -output_image output/megan_flowers20_100.png -content_weight 20 -lap_layers 2 -lap_weights 100
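As a hedged sketch of how these flags enter the objective (hypothetical Python names; the actual logic lives in lap_style.lua), the content, style, and per-layer Laplacian losses are combined into a single weighted sum, with one weight per Laplacian layer:

```python
def total_loss(content_loss, style_loss, lap_losses,
               content_weight, style_weight, lap_weights):
    # hypothetical sketch: -content_weight scales the content term,
    # and each Laplacian layer contributes its own weighted term
    # (as suggested by the -lap_layers / -lap_weights flags)
    loss = content_weight * content_loss + style_weight * style_loss
    for lap, w in zip(lap_losses, lap_weights):
        loss += w * lap
    return loss
```

In the sample command above, `-content_weight 20` and `-lap_weights 100` would correspond to `content_weight=20` and a single Laplacian weight of 100.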

Sample images:

The four images in each group are: 1) content image, 2) style image, 3) image synthesized with the original Gatys-style, and 4) image synthesized with Lapstyle.

Note: although photo-realistic style transfer [3] performs amazingly well on their test images, it doesn't work on the images we tested. It seems that, for it to work well, the content image and the style image have to have highly similar layouts and semantic content.


You are welcome to cite the paper with this BibTeX entry:

  author    = {Shaohua Li and Xinxing Xu and Liqiang Nie and Tat-Seng Chua},
  title     = {Laplacian-Steered Neural Style Transfer},
  booktitle = {Proceedings of the ACM Multimedia Conference (MM), to appear.},
  year      = {2017},


[1] Leon A. Gatys, Alexander S. Ecker, and Matthias Bethge. 2016. Image style transfer using convolutional neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2414–2423.

[2] Chuan Li and Michael Wand. 2016. Combining Markov random fields and convolutional neural networks for image synthesis. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2479–2486.

[3] Fujun Luan, Sylvain Paris, Eli Shechtman, and Kavita Bala. 2017. Deep Photo Style Transfer. arXiv preprint arXiv:1703.07511 (2017).

