This package aims to simplify working with a range of AI problems. We keep the interfaces of our code as simple as possible while maintaining reasonable flexibility for extension.
Currently, the following is supported:
- Supervised learning with deep, recurrent, and convolutional neural networks, with a PyTorch backend.
- Learning generative models such as VAEs and GANs for both simple and complex distributions.
We reuse scikit-learn-like interfaces wherever possible. This yields a few benefits:
- Little to no learning curve: you can use all the models the same way you use them in scikit-learn.
- Efficient preprocessing of datasets that fit in memory using the `FeatureUnion` or `Pipeline` classes. With `dask` one could possibly go far beyond memory size.
- All the machinery necessary for hyperparameter setting and selection. The code can be used directly with `GridSearchCV` from scikit-learn, or better yet with `BayesSearchCV` from scikit-optimize, which is more efficient in the number of model trainings.
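As a rough illustration of the scikit-learn workflow this package aims to be compatible with, the sketch below runs a grid search over a pipeline. It uses a plain scikit-learn estimator as a stand-in; the idea (an assumption here, not verified against noxer's API) is that a noxer model could be dropped into the same `Pipeline` slot.

```python
# Sketch of the scikit-learn-style workflow, assuming noxer estimators
# are drop-in compatible with sklearn's fit/predict interface.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Toy dataset that fits in memory.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Preprocessing + model in a single Pipeline; a noxer estimator would
# take the place of LogisticRegression here (hypothetical substitution).
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])

# Hyperparameter selection via GridSearchCV; BayesSearchCV from
# scikit-optimize accepts the same estimator/param-space pattern.
search = GridSearchCV(pipe, {"clf__C": [0.1, 1.0, 10.0]}, cv=3)
search.fit(X, y)
```

`BayesSearchCV` would be used the same way, replacing the parameter grid with a search space.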
Install using pip in a terminal:
- If you only want to use the code:

      [sudo] pip install https://github.com/iaroslav-ai/noxer/archive/master.zip

- If you want to edit the code:

      git clone https://github.com/iaroslav-ai/noxer.git
      cd noxer
      sudo pip install -e .
Documentation for the code is extracted from docstrings and is located at https://noxer-org.github.io/.
See example usage in the `examples` folder.
Icon made by Freepik from www.flaticon.com .