

⚠️ NeuPy now uses Tensorflow as a computational backend ⚠️

Starting from version 0.7.0, NeuPy uses Tensorflow as a computational backend for deep learning models. Theano users can still use NeuPy with the old backend by installing the latest release before 0.7.0 (any 0.6.* version). Documentation for the Theano-based versions can be downloaded from the website or generated from the code.
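For example, the last Theano-compatible release can be pinned with a standard pip version constraint:

```shell
$ pip install "neupy>=0.6.0,<0.7.0"
```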


NeuPy v0.7.1

NeuPy is a Python library for prototyping and building neural networks. NeuPy uses Tensorflow as a computational backend for deep learning models.

Installation

$ pip install neupy

User Guide

Articles and Notebooks

Growing Neural Gas

Growing Neural Gas is an algorithm that learns the topological structure of the data.

The code that generates the animation can be found in this IPython notebook
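The algorithm's core loop is compact enough to sketch in plain Python. The following is a simplified illustration, not NeuPy's implementation: node removal is omitted and all parameter names and values are chosen for this sketch.

```python
import random

def growing_neural_gas(data, max_nodes=30, eps_b=0.2, eps_n=0.006,
                       max_age=50, lam=100, alpha=0.5, decay=0.995,
                       n_steps=5000, seed=0):
    """Minimal Growing Neural Gas sketch: a graph of nodes adapts to data."""
    rng = random.Random(seed)
    nodes = [list(rng.choice(data)), list(rng.choice(data))]
    errors = [0.0, 0.0]
    edges = {}  # (i, j) with i < j -> age of the edge
    for step in range(1, n_steps + 1):
        x = rng.choice(data)
        # Find the two nodes nearest to the input sample
        dists = [sum((a - b) ** 2 for a, b in zip(node, x)) for node in nodes]
        s1, s2 = sorted(range(len(nodes)), key=dists.__getitem__)[:2]
        errors[s1] += dists[s1]
        # Move the winner toward the input; neighbors move less, edges age
        nodes[s1] = [a + eps_b * (b - a) for a, b in zip(nodes[s1], x)]
        for i, j in list(edges):
            if s1 in (i, j):
                other = j if i == s1 else i
                nodes[other] = [a + eps_n * (b - a)
                                for a, b in zip(nodes[other], x)]
                edges[(i, j)] += 1
        # Refresh the winner pair's edge and drop edges that grew too old
        edges[tuple(sorted((s1, s2)))] = 0
        edges = {e: age for e, age in edges.items() if age <= max_age}
        # Every `lam` steps, grow a node where accumulated error is largest
        if step % lam == 0 and len(nodes) < max_nodes:
            q = max(range(len(nodes)), key=errors.__getitem__)
            neighbors = [j if i == q else i for i, j in edges if q in (i, j)]
            if neighbors:
                f = max(neighbors, key=errors.__getitem__)
                nodes.append([(a + b) / 2 for a, b in zip(nodes[q], nodes[f])])
                errors[q] *= alpha
                errors[f] *= alpha
                errors.append(errors[q])
                new = len(nodes) - 1
                del edges[tuple(sorted((q, f)))]
                edges[tuple(sorted((q, new)))] = 0
                edges[tuple(sorted((f, new)))] = 0
        errors = [e * decay for e in errors]
    return nodes, edges

# Toy data: two clusters in 2D
rng = random.Random(1)
data = ([(rng.gauss(0, 0.1), rng.gauss(0, 0.1)) for _ in range(200)]
        + [(rng.gauss(2, 0.1), rng.gauss(2, 0.1)) for _ in range(200)])
nodes, edges = growing_neural_gas(data)
```

After training, the nodes cluster where the data lives and the edges connect nodes that are topological neighbors.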

Making Art with Growing Neural Gas

Growing Neural Gas is another example of an algorithm that follows a simple set of rules which, on a large scale, can generate complex patterns.

The image on the left is a great example of the art style that can be generated with a simple set of rules.

The main notebook that generates the image can be found here

Self-Organizing Maps and Applications

Self-Organizing Maps (SOM, or SOFM) is a very simple and powerful algorithm with a wide variety of applications. This article covers some of them, including:

  • Visualizing Convolutional Neural Networks
  • Data topology learning
  • High-dimensional data visualization
  • Clustering
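The SOFM update rule itself fits in a few lines. Below is a plain-Python sketch of a one-dimensional map, not NeuPy's implementation; all names and constants are illustrative.

```python
import math
import random

def train_sofm(data, n_units=10, epochs=20, lr=0.5, radius=2.0):
    """Train a 1-D self-organizing map on scalar data points."""
    rng = random.Random(0)
    weights = [rng.random() for _ in range(n_units)]
    for epoch in range(epochs):
        # Learning rate and neighborhood radius shrink over time
        t = epoch / epochs
        cur_lr = lr * (1 - t)
        cur_radius = radius * (1 - t) + 0.01
        for x in data:
            # Best matching unit: the weight closest to the input
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(n_units):
                # Grid neighbors of the BMU are pulled toward the input too,
                # with influence decaying with distance on the grid
                influence = math.exp(-((i - bmu) ** 2) / (2 * cur_radius ** 2))
                weights[i] += cur_lr * influence * (x - weights[i])
    return weights

rng = random.Random(1)
data = [rng.random() for _ in range(200)]
weights = train_sofm(data)
```

Because neighboring units are updated together, the trained weights preserve the topology of the grid, which is what makes SOFM useful for visualization and clustering alike.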

Visualizing CNN based on Pre-trained VGG19

This notebook shows how you can easily explore the reasons behind a convolutional network's predictions and understand what types of features have been learned in different layers of the network.

In addition, this notebook shows how to use neural network architectures in NeuPy, like VGG19, with pre-trained parameters.

Visualize Algorithms based on the Backpropagation

The image on the left shows a comparison between the paths that different algorithms take along the descent. It's interesting to see how much information about an algorithm can be extracted from simple trajectory paths. All of this is covered and explained in the article.
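As a rough illustration of why the trajectories differ, here is a plain-Python sketch (not code from the article) comparing plain gradient descent with momentum on an ill-conditioned quadratic:

```python
def descent_path(grad, start, lr=0.05, momentum=0.0, steps=100):
    """Trace the trajectory of (momentum) gradient descent in 2D."""
    (x, y), (vx, vy) = start, (0.0, 0.0)
    path = [(x, y)]
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Velocity accumulates past gradients; momentum=0 is plain GD
        vx = momentum * vx - lr * gx
        vy = momentum * vy - lr * gy
        x, y = x + vx, y + vy
        path.append((x, y))
    return path

# Ill-conditioned quadratic bowl: f(x, y) = x**2 + 10 * y**2
grad = lambda x, y: (2 * x, 20 * y)
gd_path = descent_path(grad, start=(5.0, 2.0))
momentum_path = descent_path(grad, start=(5.0, 2.0), momentum=0.9)
```

Both paths end near the minimum, but the momentum trajectory overshoots and oscillates across the narrow axis of the bowl while plain gradient descent does not, which is exactly the kind of behavior that trajectory plots make visible.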

Hyperparameter optimization for Neural Networks

This article covers different approaches for hyperparameter optimization.

  • Grid Search
  • Random Search
  • Hand-tuning
  • Gaussian Process with Expected Improvement
  • Tree-structured Parzen Estimators (TPE)
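As a minimal illustration of the simplest of these approaches, here is a sketch of random search; the objective function and search space below are made up for the example:

```python
import random

def random_search(objective, space, n_iter=50, seed=0):
    """Sample hyperparameters at random and keep the best configuration."""
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_iter):
        # Draw one value per hyperparameter from its search space
        params = {name: rng.choice(values) for name, values in space.items()}
        score = objective(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective: pretend the validation loss depends on two hyperparameters
space = {"lr": [0.001, 0.01, 0.1, 1.0], "units": [16, 32, 64, 128]}
objective = lambda p: abs(p["lr"] - 0.01) + abs(p["units"] - 64) / 100
best_params, best_score = random_search(objective, space)
```

Grid search enumerates every combination in `space` instead of sampling, while the Gaussian Process and TPE approaches use previous evaluations to decide which configuration to try next.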

The Art of SOFM

In this article, I just want to show how beautiful a neural network can sometimes be. I think it's quite rare that an algorithm can not only extract knowledge from the data, but also produce something beautiful using exactly the same set of training rules, without any modifications.

Discrete Hopfield Network

An article with extensive theoretical background on the Discrete Hopfield Network. It also includes examples that show the advantages and limitations of the algorithm.

The image on the left is a visualization of the information stored in the network. This picture not only visualizes the network's memory, it shows everything the network knows about the world.
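The two ingredients of the network, Hebbian storage and iterative recall, can be sketched in plain Python (an illustration, not the article's code):

```python
def train_hopfield(patterns):
    """Hebbian rule: weights accumulate outer products of stored patterns."""
    n = len(patterns[0])
    weights = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # no self-connections
                    weights[i][j] += p[i] * p[j]
    return weights

def recall(weights, state, steps=10):
    """Repeatedly update all neurons until the state stops changing."""
    n = len(state)
    state = list(state)
    for _ in range(steps):
        new = [1 if sum(weights[i][j] * state[j] for j in range(n)) >= 0
               else -1 for i in range(n)]
        if new == state:  # reached a stable state (a stored memory)
            break
        state = new
    return state

pattern = [1, -1, 1, -1, 1, -1]
weights = train_hopfield([pattern])
corrupted = [-1, -1, 1, -1, 1, -1]  # first bit flipped
restored = recall(weights, corrupted)
```

Recall converges from the corrupted input back to the stored pattern, which is the network's associative-memory behavior; its limitations appear when too many or too similar patterns are stored.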

Create unique text-style with SOFM

This article describes a step-by-step solution that allows you to generate unique styles with arbitrary text.

Playing with MLP visualizations

This notebook shows interesting ways to look inside your MLP network.

Exploring world with Value Iteration Network (VIN)

One of the basic applications of the Value Iteration Network: the network learns how to find an optimal path between two points in an environment with obstacles.
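The classical value-iteration step that the network builds on can be sketched on a toy grid world (an illustration with a made-up grid and reward, not the VIN model itself):

```python
def value_iteration(grid, goal, gamma=0.95, iters=100):
    """Bellman updates on a grid: value = discounted value of best neighbor."""
    rows, cols = len(grid), len(grid[0])
    values = [[0.0] * cols for _ in range(rows)]
    for _ in range(iters):
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 1:      # obstacle
                    continue
                if (r, c) == goal:       # terminal state with reward 1
                    values[r][c] = 1.0
                    continue
                best = 0.0
                for nr, nc in [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]:
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                        best = max(best, gamma * values[nr][nc])
                values[r][c] = best
    return values

def greedy_path(values, grid, start, goal, max_steps=100):
    """Follow the neighbor with the highest value until the goal is reached."""
    path, pos = [start], start
    while pos != goal and len(path) <= max_steps:
        r, c = pos
        options = [(nr, nc)
                   for nr, nc in [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
                   if 0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                   and grid[nr][nc] == 0]
        pos = max(options, key=lambda p: values[p[0]][p[1]])
        path.append(pos)
    return path

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]  # 1 marks an obstacle
values = value_iteration(grid, goal=(0, 0))
path = greedy_path(values, grid, start=(2, 0), goal=(0, 0))
```

The extracted path routes around the obstacle wall; VIN embeds this kind of iteration as a differentiable module so the planning computation itself can be learned.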

Features learned by Restricted Boltzmann Machine (RBM)

A set of examples that use and explore the knowledge extracted by a Restricted Boltzmann Machine.
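For reference, here is a plain-Python sketch of how a binary RBM extracts features with one-step contrastive divergence (CD-1). This is an illustration, not the code from the examples; the toy data and all parameters are made up.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_rbm(data, n_hidden=3, lr=0.1, epochs=50, seed=0):
    """Binary RBM trained with one-step contrastive divergence (CD-1)."""
    rng = random.Random(seed)
    n_visible = len(data[0])
    w = [[rng.gauss(0, 0.1) for _ in range(n_hidden)] for _ in range(n_visible)]
    b_v = [0.0] * n_visible  # visible biases
    b_h = [0.0] * n_hidden   # hidden biases
    for _ in range(epochs):
        for v0 in data:
            # Positive phase: hidden unit probabilities given the data
            h0 = [sigmoid(b_h[j] + sum(v0[i] * w[i][j] for i in range(n_visible)))
                  for j in range(n_hidden)]
            h_s = [1 if rng.random() < p else 0 for p in h0]
            # Negative phase: reconstruct visibles (probabilities used
            # directly, a common simplification), then hidden probs again
            v1 = [sigmoid(b_v[i] + sum(h_s[j] * w[i][j] for j in range(n_hidden)))
                  for i in range(n_visible)]
            h1 = [sigmoid(b_h[j] + sum(v1[i] * w[i][j] for i in range(n_visible)))
                  for j in range(n_hidden)]
            # CD-1 update: data-driven statistics minus model-driven statistics
            for i in range(n_visible):
                for j in range(n_hidden):
                    w[i][j] += lr * (v0[i] * h0[j] - v1[i] * h1[j])
                b_v[i] += lr * (v0[i] - v1[i])
            for j in range(n_hidden):
                b_h[j] += lr * (h0[j] - h1[j])
    return w, b_v, b_h

# Toy binary data with two obvious feature groups
data = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
w, b_v, b_h = train_rbm(data)
# Each column of w is one learned feature detector over the visible units
```

The "knowledge extracted" by an RBM lives in the columns of the weight matrix: each hidden unit learns to respond to a recurring pattern across the visible units.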