deep-texture-synthesis-cnn-keras

In this project I have implemented the paper "Texture Synthesis Using Convolutional Neural Networks" by Gatys et al. The paper introduces a new model for generating natural textures based on the feature spaces of convolutional neural networks optimised for object recognition. Within the model, textures are represented by the correlations between feature maps in several layers of the network. The authors show that, across layers, the texture representations increasingly capture the statistical properties of natural images while making object information more and more explicit. The model provides a new tool for generating stimuli for neuroscience and might offer insights into the deep representations learned by convolutional neural networks.
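
The correlations between feature maps mentioned above are the Gram matrices of the VGG activations. Below is a minimal sketch of how that correlation-based texture loss can be written with TensorFlow/Keras; the function names (gram_matrix, texture_loss) and the normalisation constant follow the formulation in Gatys et al. rather than the exact code in this repository, so treat it as an illustration.

```python
import tensorflow as tf

def gram_matrix(feature_maps):
    # feature_maps: activations of one VGG layer, shape (height, width, channels)
    channels = int(feature_maps.shape[-1])
    # Flatten the spatial dimensions so each row holds one feature map
    features = tf.reshape(tf.transpose(feature_maps, perm=[2, 0, 1]), [channels, -1])
    # Correlations between feature maps: G = F F^T
    return tf.matmul(features, features, transpose_b=True)

def texture_loss(target_features, generated_features):
    # Mean squared difference between the Gram matrices of the reference
    # texture and the image being synthesised, for a single layer
    h, w, c = [int(d) for d in target_features.shape]
    size = h * w
    g_target = gram_matrix(target_features)
    g_generated = gram_matrix(generated_features)
    return tf.reduce_sum(tf.square(g_target - g_generated)) / (4.0 * (c ** 2) * (size ** 2))
```

The total loss is the sum of this per-layer term over the chosen VGG layers, and the output image is obtained by minimising it with respect to the pixels of a noise image.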

Input Image


Output Image

The following output was generated after 500 iterations. You can control the number of iterations by passing the desired value for the iteration argument to the buildTexture function.
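
For example, a call along these lines would run 500 optimisation steps; the function name buildTexture and the iteration argument come from the description above, while the module name is a placeholder, so check the script in this repository for the exact import and signature.

```python
# Hypothetical usage sketch; the module name below is a placeholder.
from texture_synthesis import buildTexture

buildTexture(iteration=500)  # run 500 optimisation steps of the texture synthesis
```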

References

  • Leon A. Gatys, Alexander S. Ecker, Matthias Bethge: "Texture Synthesis Using Convolutional Neural Networks", 2015; arXiv:1505.07376.
  • Implementation of style transfer by François Chollet in the Keras Neural Style Transfer example.
