scikit-neuralnetwork

Deep neural network implementation without the learning cliff! This library implements multi-layer perceptrons, auto-encoders and (soon) recurrent neural networks with a stable Future Proof™ interface compatible with scikit-learn, for a more user-friendly and Pythonic experience. It's a wrapper for powerful existing libraries: currently Lasagne, with Keras and Blocks support planned.

NOTE: This project is possible thanks to the nucl.ai Conference on July 18-20. Join us in Vienna!

Features

Thanks to the underlying Lasagne implementation, this library supports the following neural network features, which are exposed in an intuitive and well-documented API (see the sketch after this list):

  • Activation Functions — Sigmoid, Tanh, Rectifier, Softmax, Linear.
  • Layer Types — Convolution (greyscale and color, 2D), Dense (standard, 1D).
  • Learning Rules — sgd, momentum, nesterov, adadelta, adagrad, rmsprop, adam.
  • Regularization — L1, L2 and dropout.
  • Dataset Formats — numpy.ndarray, scipy.sparse, coming soon: iterators.
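
For a taste of how these options combine, here's a minimal sketch (layer sizes, kernel shape and hyperparameter values are illustrative rather than tuned; parameter names follow the sknn documentation):

from sknn.mlp import Classifier, Convolution, Layer

# A small 2D convolutional classifier combining several of the features
# above: a Rectifier convolution layer, an L2-regularized network,
# and the 'nesterov' learning rule. All sizes and rates are placeholders.
nn = Classifier(
    layers=[
        Convolution("Rectifier", channels=8, kernel_shape=(3, 3)),
        Layer("Rectifier", units=64),
        Layer("Softmax")],
    learning_rule="nesterov",
    learning_rate=0.01,
    regularize="L2",
    n_iter=5)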

If a feature you need is missing, consider opening a GitHub Issue with a detailed explanation of your use case, and we'll see what we can do.

Installation & Testing

A) Download Latest Release [Recommended]

If you want to use the latest official release, you can install it from PyPI directly:

> pip install scikit-neuralnetwork

This will also install a copy of Lasagne and a few other minor dependencies. We highly recommend using a virtual environment for Python.
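
For example, a minimal setup with virtualenv (the environment name sknn-env is arbitrary):

> pip install virtualenv
> virtualenv sknn-env
> source sknn-env/bin/activate
> pip install scikit-neuralnetwork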

B) Pulling Repositories [Optional]

You'll need some dependencies, which you can install manually as follows:

> pip install numpy scipy theano lasagne

Once that's done, you can grab this repository and install it from setup.py in the same way:

> git clone https://github.com/aigamedev/scikit-neuralnetwork.git
> cd scikit-neuralnetwork; python setup.py develop

This will make the sknn package globally available within Python, as a link to the current directory.
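
To check that the develop-mode install points at your checkout, you can inspect the package's path from a Python shell:

>>> import sknn
>>> sknn.__file__    # should point inside your scikit-neuralnetwork clone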

Running Automated Tests

You can then run the samples and benchmarks available in the examples/ folder, or launch the tests to check that everything is working:

> pip install nose
> nosetests -v sknn.tests

[Image: console output from the test run (docs/console_tests.png)]

We strive to maintain 100% test coverage of all code paths, so that breaking changes in the underlying backend libraries are caught automatically.

Demonstration

To run a visualization that uses sknn.mlp.Classifier, run the following command from the project's root folder:

> python examples/plot_mlp.py --params activation

There are other parameters you can plot as well, for example iterations, rules or units. The datasets are randomized on each run, but the output should be an image that looks like this...

[Image: example plot comparing activation functions (docs/plot_activation.png)]

Benchmarks

The following section compares nolearn (and lasagne) with sknn (and pylearn2), evaluating each as a black box. In theory these neural network models are all the same, but in practice every implementation detail can impact the result, so here we attempt to measure the differences between the underlying libraries.

The results shown come from training for 10 epochs on two-thirds of the original MNIST data, on two different machines:

  1. GPU Results: NVIDIA GeForce GTX 650 (Memory: 1024 MB, Cores: 384) on Ubuntu 14.04.
  2. CPU Results: Intel Core i7 2 GHz (256 KB L2, 6 MB L3) on OS X Mavericks 10.9.5.

You can run the following command to reproduce the benchmarks on your machine:

> python examples/bench_mnist.py (sknn|lasagne)

... to generate the statistics below (gathered here over 25 runs).

MNIST       sknn.mlp (CPU)    nolearn.lasagne (CPU)    sknn.mlp (GPU)    nolearn.lasagne (GPU)
Accuracy    97.99% ±0.046     97.77% ±0.054            98.00% ±0.06      97.76% ±0.06
Training    20.1s ±1.07       45.7s ±1.10              33.10s ±0.11      31.93s ±0.09

All the neural networks were set up as similarly as possible, within the limits of the parameters each implementation exposes through its interface. In particular, the model has a single hidden layer of 300 Rectified Linear (ReLU) units, trained on the same data with validation and monitoring disabled. The remaining third of the MNIST dataset was used only to compute the score once training terminated.
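
As a rough sketch of the sknn side of that configuration (the exact hyperparameters, such as the learning rate, are set in examples/bench_mnist.py; the values below are assumptions):

from sknn.mlp import Classifier, Layer

# Approximation of the benchmark model described above: one hidden
# layer of 300 ReLU units, trained for 10 epochs with no validation.
# learning_rate is a placeholder, not the value used in the benchmark.
nn = Classifier(
    layers=[
        Layer("Rectifier", units=300),
        Layer("Softmax")],
    learning_rate=0.01,
    n_iter=10)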

WARNING: These numbers should not be considered definitive, and will fluctuate as the underlying libraries change. If you have any ideas for making the accuracy results more similar, please submit a Pull Request against the benchmark script.

Getting Started

The library supports both regressors (to estimate continuous outputs from inputs) and classifiers (to predict labels from features). This is the sklearn-compatible API:

from sknn.mlp import Classifier, Layer

# X_train and y_train are numpy arrays shaped (n_samples, n_features)
# and (n_samples,), just as in scikit-learn.
nn = Classifier(
    layers=[
        Layer("Rectifier", units=100),
        Layer("Softmax")],  # classifiers need a Softmax output layer
    learning_rate=0.02,
    n_iter=10)
nn.fit(X_train, y_train)

y_valid = nn.predict(X_valid)

score = nn.score(X_test, y_test)
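
A regressor follows the same pattern; a minimal sketch, assuming continuous targets (the layer sizes are illustrative):

from sknn.mlp import Regressor, Layer

# Same API, but for continuous outputs: the output layer stays Linear.
nn = Regressor(
    layers=[
        Layer("Rectifier", units=100),
        Layer("Linear")],
    learning_rate=0.02,
    n_iter=10)
nn.fit(X_train, y_train)    # y_train holds float values here
y_pred = nn.predict(X_valid)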

The generated documentation is available as a standalone page, where you can find more information about parameters, as well as examples in the User Guide.

Links & References

  • Lasagne by benanne — The amazing neural network library that powers sknn.
  • Theano by LISA Lab — Underlying array/math library for efficient computation.
  • scikit-learn by INRIA — Machine learning library with an elegant Pythonic interface.
  • nolearn by dnouri — Similar wrapper library for Lasagne compatible with scikit-learn.
