A Theano framework for building and training neural networks




Blocks is a framework that helps you build neural network models on top of Theano. Currently it supports and provides:

  • Constructing parametrized Theano operations, called "bricks"
  • Pattern matching to select variables and bricks in large models
  • Algorithms to optimize your model
  • Saving and resuming of training
  • Monitoring and analyzing values during training (on the training set as well as on test sets)
  • Application of graph transformations, such as dropout

In the future we also hope to support:

  • Dimension, type, and axis checking

See Also:
  • Fuel, the data processing engine developed primarily for Blocks.
  • Blocks-examples for maintained examples of scripts using Blocks.
  • Blocks-extras for semi-maintained additional Blocks components.

Citing Blocks

If you use Blocks or Fuel in your work, we'd really appreciate it if you could cite the following paper:

Bart van Merriënboer, Dzmitry Bahdanau, Vincent Dumoulin, Dmitriy Serdyuk, David Warde-Farley, Jan Chorowski, and Yoshua Bengio, "Blocks and Fuel: Frameworks for deep learning," arXiv preprint arXiv:1506.00619 [cs.LG], 2015.

Please see the documentation for more information.
If you want to contribute, please make sure to read the developer guidelines.