fast-neural-style-tensorflow

A TensorFlow implementation of Perceptual Losses for Real-Time Style Transfer and Super-Resolution.

This code is based on Tensorflow-Slim and OlavHN/fast-neural-style.

Samples:

Each configuration below corresponds to a style image and a stylized sample (images not reproduced here):

  • wave.yml
  • cubist.yml
  • denoised_starry.yml
  • mosaic.yml
  • scream.yml
  • feathers.yml
  • udnie.yml

Requirements and Prerequisites:

  • Python 2.7.x
  • TensorFlow >= 1.0 is now supported

Attention: This code also supports TensorFlow == 0.11. If that is your version, switch to commit 5309a2a (git reset --hard 5309a2a).
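
If you are not sure which TensorFlow version is installed, you can check it from the command line (a quick convenience check, not part of this repo):

python -c "import tensorflow as tf; print(tf.__version__)"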

Also make sure you have pyyaml installed:

pip install pyyaml
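
pyyaml is used to parse the training configurations under conf/. As a rough sketch of what that involves (this loader is only illustrative and not necessarily how train.py reads the file):

# Illustrative sketch: parse a training configuration with pyyaml.
# train.py may load and validate the file differently.
import yaml

with open("conf/wave.yml") as f:
    conf = yaml.safe_load(f)  # a plain dict of training options
print(conf)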

Use Trained Models:

You can download all 7 trained models from Baidu Drive.

To generate a sample from the model "wave.ckpt-done", run:

python eval.py --model_file <your path to wave.ckpt-done> --image_file img/test.jpg

Then check out generated/res.jpg.
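
If you want to stylize several images with the same model, one simple approach is a shell loop that copies the result after each run; this assumes the output always lands at generated/res.jpg, as above:

# Example only: stylize every JPEG under img/ with the "wave" model,
# giving each result a distinct name under generated/.
for f in img/*.jpg; do
    python eval.py --model_file <your path to wave.ckpt-done> --image_file "$f"
    cp generated/res.jpg "generated/$(basename "$f" .jpg)_wave.jpg"
done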

Train a Model:

To train a model from scratch, first download the VGG16 model from TensorFlow Slim. Extract the file vgg_16.ckpt, then copy it to the folder pretrained/:

cd <this repo>
mkdir pretrained
cp <your path to vgg_16.ckpt>  pretrained/

Then download the COCO dataset and unzip it; you will get a folder named "train2014" containing many raw images. Create a symbolic link to it:

cd <this repo>
ln -s <your path to the folder "train2014"> train2014

Train the model of "wave":

python train.py -c conf/wave.yml

(Optional) Use tensorboard:

tensorboard --logdir models/wave/

Checkpoints will be written to "models/wave/".

View the configuration file for details.
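
For orientation only, a configuration of this kind typically names the style image, the loss network checkpoint, and the loss weights. The keys below are hypothetical placeholders rather than a copy of conf/wave.yml, so check the real file before editing:

# Hypothetical example; the actual key names are in conf/wave.yml.
style_image: img/wave.jpg               # style target
naming: wave                            # checkpoints go to models/wave/
loss_model_file: pretrained/vgg_16.ckpt
content_weight: 1.0
style_weight: 100.0
image_size: 256
batch_size: 4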
