
TextureNets_implementation

PyTorch (version 0.4.1) implementation of the texture synthesis model from Texture Networks: Feed-forward Synthesis of Textures and Stylized Images by Ulyanov et al.

Based on Gatys' code

Training

The Python script train_g2d_periodic.py trains a generator network. The code requires the numpy, PIL, and torch libraries. The VGG-19 perceptual loss between 2D images uses Gatys' implementation. To run the code, first download the PyTorch VGG-19 model from the Bethge lab by running:

sh download_models.sh 

Using the display package is optional.

The name of the example texture is defined by the variable input_name.

The example textures go in the folder Textures.

The output file *params.pytorch contains the trained parameters of the generator network.
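The perceptual loss mentioned above compares Gram matrices of VGG-19 feature maps, following Gatys et al. The sketch below illustrates that statistic with numpy; the function name `gram_matrix` and the shapes are illustrative, not the repository's actual API:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map.

    Flattens the spatial dimensions and returns the (channels, channels)
    matrix of channel inner products, normalized by the number of
    spatial positions.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (h * w)

# The texture loss penalizes the squared difference between Gram
# matrices of the generated and example textures at several VGG-19
# layers; here a single random pair stands in for those feature maps.
generated = np.random.rand(3, 8, 8)
example = np.random.rand(3, 8, 8)
loss = np.sum((gram_matrix(generated) - gram_matrix(example)) ** 2)
```

Because the Gram matrix discards spatial layout and keeps only channel correlations, matching it reproduces texture statistics rather than a particular image.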

Sample

The Python script sample_g2d_periodic.py loads the trained parameters and synthesizes a square texture of size sample_size. The code requires the numpy and torch libraries.
model_folder must be set as:

model_folder = 'Trained_models/[name of folder of trained model]'
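In the Texture Networks architecture, the generator maps multi-scale noise to a texture, so each noise sample yields a different instance of the learned texture. The sketch below builds such a noise pyramid with numpy; the function name `noise_pyramid` and the number of scales are assumptions for illustration, not values taken from the repository's scripts:

```python
import numpy as np

def noise_pyramid(sample_size, n_scales=6, channels=3, seed=None):
    """Multi-scale uniform-noise inputs for a texture generator.

    Returns one noise array per scale, with the spatial size halved at
    each coarser scale, mirroring the multi-scale inputs described in
    Ulyanov et al.
    """
    rng = np.random.default_rng(seed)
    sizes = [sample_size // (2 ** k) for k in range(n_scales)]
    return [rng.uniform(size=(channels, s, s)) for s in sizes]

# Noise inputs for a 256x256 sample: spatial sizes 256, 128, 64, 32, 16, 8.
pyramid = noise_pyramid(256)
```

Since the generator is fully convolutional, sample_size at synthesis time need not match the training resolution, as long as it divides cleanly across the scales.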
