Crino: a neural-network library based on Theano
Crino lets you "hand-craft" neural-network architectures using a modular framework inspired by Torch. The library also provides standard implementations of:
- auto-encoders (AE)
- multi-layer perceptrons (MLP)
- deep neural networks (DNN)
- input-output deep architectures (IODA)
IODA is a specialization of DNNs, designed for cases where both the input and the output spaces are high-dimensional. The input and output layers are initialized with an unsupervised pre-training step; the backpropagation algorithm then performs the final supervised learning step. This process is based on the stacked auto-encoder strategy commonly used in DNN training.
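The stacked auto-encoder strategy mentioned above can be sketched in plain NumPy (this is not Crino's actual API, just an illustration of the idea): each layer is pre-trained as a tied-weight auto-encoder on the codes produced by the previous layer, and the resulting weights would then initialize the network before supervised backpropagation.

```python
# Minimal NumPy sketch of greedy layer-wise auto-encoder pre-training.
# Function names and hyper-parameters are illustrative, not Crino's API.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pretrain_autoencoder(X, n_hidden, n_epochs=100, lr=0.5):
    """Train a one-layer tied-weight auto-encoder on X.

    Returns the encoder parameters (W, b) that would initialize
    the corresponding layer of the deep network.
    """
    n_samples, n_in = X.shape
    W = rng.normal(0.0, 0.1, (n_in, n_hidden))
    b = np.zeros(n_hidden)  # encoder bias
    c = np.zeros(n_in)      # decoder bias
    for _ in range(n_epochs):
        H = sigmoid(X @ W + b)         # encode
        R = sigmoid(H @ W.T + c)       # reconstruct (tied weights)
        dR = (R - X) * R * (1.0 - R)   # output delta (squared error)
        dH = (dR @ W) * H * (1.0 - H)  # hidden delta
        # Tied weights receive gradients from both encoder and decoder.
        W -= lr / n_samples * (X.T @ dH + dR.T @ H)
        b -= lr / n_samples * dH.sum(axis=0)
        c -= lr / n_samples * dR.sum(axis=0)
    return W, b

def greedy_pretrain(X, hidden_sizes):
    """Stack auto-encoders: each layer trains on the previous layer's codes."""
    params, data = [], X
    for n_hidden in hidden_sizes:
        W, b = pretrain_autoencoder(data, n_hidden)
        params.append((W, b))
        data = sigmoid(data @ W + b)  # codes become the next layer's input
    return params

# The (W, b) pairs would initialize the DNN before supervised backpropagation.
X = rng.random((64, 20))
params = greedy_pretrain(X, [16, 8])
print([W.shape for W, _ in params])  # [(20, 16), (16, 8)]
```

In IODA, the same unsupervised pre-training is applied to both the input-side and the output-side layers before the final supervised step.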
We are currently writing an article on IODA; we will announce it as soon as it is published.
- Install Crino:

```shell
cd to/your/preferred/path
git clone https://github.com/jlerouge/crino.git
cd crino
sudo python setup.py install
```
- Run the given example:

```shell
cd example
chmod +x example.py
./example.py
```
- Adapt it to your needs! Crino is natively compatible with Matlab-like data, as well as any format handled by SciPy/NumPy.
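As a hedged sketch of loading Matlab-like data with SciPy for use with Crino (the file name and the variable names `x_train`/`y_train` are hypothetical, not part of Crino), a round trip might look like:

```python
# Write a .mat file, then read it back with SciPy.
# 'example_data.mat', 'x_train' and 'y_train' are illustrative names.
import numpy as np
import scipy.io as sio

sio.savemat('example_data.mat', {'x_train': np.ones((10, 5)),
                                 'y_train': np.zeros((10, 1))})

data = sio.loadmat('example_data.mat')
# Theano-based code typically works with float32 arrays.
x_train = np.asarray(data['x_train'], dtype='float32')
y_train = np.asarray(data['y_train'], dtype='float32')
print(x_train.shape, y_train.shape)  # (10, 5) (10, 1)
```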
- Check the project documentation
- What does "device gpu is not available" mean? Your GPU may not support CUDA (check http://www.geforce.com/hardware/technology/cuda/supported-gpus). If it does not, there is nothing to be done. Otherwise, your Theano installation may have a problem (see http://deeplearning.net/software/theano/install.html#using-the-gpu).
- Where does the name "Crino" come from? We developed this library as an extension of Theano. In Greek mythology, Crino is the daughter of Theano.
You can contact us at the following e-mail address: email@example.com.
Feel free to open a new issue if you find a bug in Crino.
Crino is based on Theano:
- J. Bergstra, O. Breuleux, F. Bastien, P. Lamblin, R. Pascanu, G. Desjardins, J. Turian, D. Warde-Farley and Y. Bengio. “Theano: A CPU and GPU Math Expression Compiler”. Proceedings of the Python for Scientific Computing Conference (SciPy) 2010. June 30 - July 3, Austin, TX
IODA is based on the work of B. Labbé et al.:
- B. Labbé, R. Hérault and C. Chatelain. "Learning Deep Neural Networks for High Dimensional Output Problems". In IEEE International Conference on Machine Learning and Applications (ICMLA'09), December 2009.
Copyright (c) 2014 Clément Chatelain, Romain Hérault, Julien Lerouge, Romain Modzelewski (LITIS - EA 4108).
All rights reserved.
This program is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.
The sample data (located in example/data) are free to use.