PyNet is an artificial intelligence and machine learning framework built entirely in Python.
Unlike TensorFlow or PyTorch, PyNet is intended for small-scale educational and experimental purposes. It is meant to lower the barrier of entry for people interested in machine learning and artificial intelligence by using an easy-to-learn language while still implementing the key concepts.
At its core, PyNet is a library of layers, activation functions, models and APIs. It also includes some unorthodox layers that are mainly there for experimental purposes; more information can be found in the documentation provided.
NetCore is the main and first API built for the PyNet framework. Everything is written entirely in Python, with object-oriented programming as the main paradigm for each layer (see the sketch after the feature list below). Because it is pure Python, training can be quite slow depending on the machine, so NetCore is not advised to serve as a general-purpose library; other, more professionally developed libraries may better suit the task at hand.
NetCore Features:
- Dense
- Localunit (Locally Connected Layer)
- Operation
- Multichannel Convolution
- Maxpooling
- Meanpooling
- Flatten
- Reshape
- RecurrentBlock
- RNN
- LSTM
- GRU
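NetCore's actual API is documented elsewhere; the snippet below is only a conceptual sketch of the object-oriented layer idea described above, written in plain Python/NumPy. The class name `DenseSketch` and its methods are hypothetical and are not part of PyNet.

```python
# Conceptual sketch only: NOT PyNet's actual API. It illustrates the
# object-oriented layer idea behind NetCore with a hypothetical dense layer.
import numpy as np

class DenseSketch:
    """A hypothetical fully connected layer: y = activation(x @ W + b)."""

    def __init__(self, in_features, out_features, activation=np.tanh):
        # Small random weights and zero biases
        self.weights = np.random.randn(in_features, out_features) * 0.1
        self.biases = np.zeros(out_features)
        self.activation = activation

    def forward(self, x):
        # Forward pass for a single sample or a batch
        return self.activation(x @ self.weights + self.biases)

# Usage: a tiny two-layer forward pass
layer1 = DenseSketch(4, 8)
layer2 = DenseSketch(8, 2)
output = layer2.forward(layer1.forward(np.random.randn(3, 4)))
print(output.shape)  # (3, 2)
```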
NetFlash is the second API of PyNet, this time built around the JAX ecosystem, leveraging parallelized JNP (jax.numpy) operations and JIT-compiled routines to boost calculations up to 5x thanks to the XLA compiler. Everything is designed to be modular, so a custom layer can be passed into the sequential class as long as it adheres to the NetFlash specifications (an illustrative sketch follows the feature list below). More information is in the documentation.
NetFlash Features:
- Dense
- Localunit (Locally Connected Layer)
- Multichannel Convolution
- Maxpooling
- Meanpooling
- Flatten
- Operation
- Recurrent Layer
- GRU (Gated Recurrent Unit) Layer
- LSTM (Long Short-Term Memory) Layer
- Multiheaded Self-Attention
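The sketch below is not NetFlash's actual API; it only illustrates the general pattern the section describes, where jax.numpy operations are wrapped in `jax.jit` so the XLA compiler can fuse and optimize them. The function `dense_forward` is hypothetical.

```python
# Illustration only: not NetFlash's actual API. Shows jax.numpy operations
# wrapped in jax.jit so XLA compiles and fuses them.
import jax
import jax.numpy as jnp

@jax.jit
def dense_forward(params, x):
    # A JIT-compiled dense layer forward pass: tanh(x @ W + b)
    w, b = params
    return jnp.tanh(x @ w + b)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (4, 8)) * 0.1
b = jnp.zeros(8)
x = jax.random.normal(key, (32, 4))

out = dense_forward((w, b), x)  # first call compiles; later calls reuse the compiled code
print(out.shape)  # (32, 8)
```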
PyNet Alpha is the first implementation of PyNet: a module containing functions for propagating, backpropagating and updating a neural network, and the predecessor to the current framework (a minimal sketch follows the feature list below). Be aware that advanced features such as optimizers and parametric functions are not available in this implementation by default.
PyNet Alpha Features:
- Initialize
- Propagate
- Backpropagate
- Update
- Default training function (not required, user can make their own)
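The following is a hypothetical, minimal sketch of the initialize / propagate / backpropagate / update workflow described above, not PyNet Alpha's actual functions; it uses a single dense layer and a mean-squared-error loss for illustration.

```python
# Hypothetical sketch, not PyNet Alpha's actual functions: it mirrors the
# initialize -> propagate -> backpropagate -> update workflow for one
# dense layer trained with a mean-squared-error loss.
import numpy as np

def initialize(in_features, out_features):
    return {"w": np.random.randn(in_features, out_features) * 0.1,
            "b": np.zeros(out_features)}

def propagate(params, x):
    # Forward pass: x @ W + b
    return x @ params["w"] + params["b"]

def backpropagate(params, x, y, y_pred):
    # Gradients of the MSE loss with respect to the weights and biases
    error = (y_pred - y) / len(x)
    return {"w": x.T @ error, "b": error.sum(axis=0)}

def update(params, grads, learning_rate=0.01):
    # Plain gradient-descent step (no optimizers, as noted above)
    return {k: params[k] - learning_rate * grads[k] for k in params}

# One training step on random data
params = initialize(3, 1)
x, y = np.random.randn(16, 3), np.random.randn(16, 1)
params = update(params, backpropagate(params, x, y, propagate(params, x)))
```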
Regressors
- Linear Regression
- Polynomial Regression
- Logistic Regression
- Exponential Regression
- Power Regression
- Logarithmic Regression
- Sinusoidal Regression (external model, does not use the PyNet framework)
Classifiers
- K-Nearest Neighbors
- Decision Tree
- Random Forest
- Naive Bayes
- SVM
Datasets
- Cluster (classification)
- Image (variations of the MNIST dataset as a python list)
- Regression
- Text
Tools
- Arraytools
- Logic
- Math
- Scaler
- Utility
- Visual
Many more features are present, but they are either less important than the features listed above or are internal features.
Dependencies
- NumPy
- Math
- Random
- Itertools
- Time
- Matplotlib
This project is licensed under the Apache License 2.0 - see LICENSE for more details.
While PyNet has an internal configuration file, it is not very extensive and only provides default values to background processes. PyNet is all about transparency and configurability per model and does not rely heavily on a central file.
Maintainer
- 2-Con
Contributors
- None
PyNet is almost complete; some additional features and fixes remain, and it should be mostly finished by August.
Estimated date: August 2025 (uncertain and not guaranteed)