TensorFlow Wrappers for Neural Networks

Introduction

Neural Networks can handle:

  • Points represented by fixed-dimensionality vectors;
  • 2D spatial maps representing, for example, images;
  • Time-dependent functions representing, for example, time series or sound signals (not yet covered in this toolbox!).

Requirements

Python 2.7 or 3.x and TensorFlow (1.1 or later)

Contents

In this toolbox, we provide wrappers for:

  • TensorBoard visualizations: run sh tensorboard.sh to print a URL that you can open in your favorite web browser (Chrome, for example).

  • Linear units with activations: see example.py (useful for vector inputs); a sketch of this kind of wrapper follows this list

  • Convolutions with activations: see conv_example.py (useful for images)

  • Dropout: see the keep_prob and dropout variables in both examples (useful for regularization)

  • Batch normalization: toggle the batch_normalization boolean at the beginning of each code example (it dramatically speeds up training convergence)
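
To give an idea of what such wrappers look like, here is a minimal sketch of a fully-connected layer and a convolutional layer in TensorFlow 1.x that combine a linear unit or a convolution with an activation, dropout and (optionally) batch normalization. The function names dense_layer and conv_layer and their arguments are hypothetical illustrations of the pattern, not the actual contents of utils.py.

```python
import tensorflow as tf

def dense_layer(x, output_dim, name, activation=tf.nn.relu,
                keep_prob=None, batch_normalization=False, is_training=True):
    # Hypothetical wrapper: linear unit + optional batch norm, activation, dropout
    input_dim = x.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        w = tf.get_variable("w", [input_dim, output_dim],
                            initializer=tf.truncated_normal_initializer(stddev=0.1))
        b = tf.get_variable("b", [output_dim], initializer=tf.zeros_initializer())
        y = tf.matmul(x, w) + b
        if batch_normalization:
            # Batch norm placed before the non-linearity, as is common practice
            y = tf.layers.batch_normalization(y, training=is_training)
        if activation is not None:
            y = activation(y)
        if keep_prob is not None:
            # keep_prob is typically a placeholder, fed with 1.0 at test time
            y = tf.nn.dropout(y, keep_prob)
        tf.summary.histogram("weights", w)      # makes the layer visible in TensorBoard
        tf.summary.histogram("activations", y)
    return y

def conv_layer(x, output_channels, name, kernel_size=5, activation=tf.nn.relu):
    # Hypothetical wrapper: 2D convolution + activation (for image inputs)
    input_channels = x.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        w = tf.get_variable("w", [kernel_size, kernel_size,
                                  input_channels, output_channels],
                            initializer=tf.truncated_normal_initializer(stddev=0.1))
        b = tf.get_variable("b", [output_channels], initializer=tf.zeros_initializer())
        y = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME") + b
        if activation is not None:
            y = activation(y)
    return y
```

Note that with tf.layers.batch_normalization, the moving averages of the batch statistics are updated through the operations collected in tf.GraphKeys.UPDATE_OPS, which must be run alongside the training step.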

We provide the following examples:

  • example.py: Basic Neural Network (MLP = Multi-Layer Perceptron) applied to MNIST handwritten-digit image classification (a rough structural sketch follows this list)
  • conv_example.py: Convolutional Neural Network (CNN) applied to MNIST handwritten-digit image classification
  • rnn_example.py: Recurrent Neural Network (RNN) for pseudo-periodic time series classification -- not yet available.
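
For a rough idea of how example.py is structured, here is a minimal, self-contained sketch of an MLP trained on MNIST in TensorFlow 1.x. It uses the built-in tf.layers.dense instead of the toolbox wrappers so it runs on its own, and all hyper-parameters (layer sizes, learning rate, number of steps) are illustrative rather than taken from the actual script.

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data", one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])
labels = tf.placeholder(tf.float32, [None, 10])
keep_prob = tf.placeholder(tf.float32)

# Two hidden layers with ReLU and dropout, then a linear output layer
h1 = tf.nn.dropout(tf.layers.dense(x, 256, activation=tf.nn.relu), keep_prob)
h2 = tf.nn.dropout(tf.layers.dense(h1, 256, activation=tf.nn.relu), keep_prob)
logits = tf.layers.dense(h2, 10)

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)
correct = tf.equal(tf.argmax(logits, 1), tf.argmax(labels, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tf.summary.FileWriter("logs", sess.graph)  # graph visible in TensorBoard
    for _ in range(2000):
        batch_x, batch_y = mnist.train.next_batch(100)
        sess.run(train_step, {x: batch_x, labels: batch_y, keep_prob: 0.5})
    print("test accuracy:",
          sess.run(accuracy, {x: mnist.test.images,
                              labels: mnist.test.labels,
                              keep_prob: 1.0}))
```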

Authors

Warith HARCHAOUI, Astrid MERCKLING -- MAP5 -- Université Paris Descartes -- 2017