
planetis-m/neuralnet-examples


Neural net examples

In this repo you will find examples of neural networks implemented from scratch using my matrix library.

Examples included

The perceptron algorithm. The activation function is a binary step function, the Heaviside step function, and hinge loss is used as the loss function. It is capable of binary classification; the example functions as an OR gate.

{
   "layers": [2, 1],
   "activation_function": ["heaviside"]
}
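A minimal sketch of this setup in Python/NumPy (hypothetical, not the repo's Nim code): a single unit with a Heaviside step activation, trained with the perceptron update rule to act as an OR gate.

```python
import numpy as np

def heaviside(x):
    # binary step activation: 1 if the input is >= 0, else 0
    return np.where(x >= 0.0, 1.0, 0.0)

# OR gate truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, size=2)  # layers: [2, 1]
b = 0.0
lr = 0.1

for _ in range(100):
    for xi, yi in zip(X, y):
        pred = heaviside(xi @ w + b)
        err = yi - pred          # perceptron error-driven correction
        w += lr * err * xi
        b += lr * err

print(heaviside(X @ w + b))  # [0. 1. 1. 1.]
```

Since OR is linearly separable, the perceptron convergence theorem guarantees the loop settles on a separating hyperplane.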

Two-layer neural network trained with gradient descent. Both layers use sigmoid as the activation function; the example functions as an XOR gate. Weights are initialized from a uniform distribution U(-sqrt(6 / (in + out)), sqrt(6 / (in + out))) (Xavier initialization).

{
   "layers": [2, 3, 1],
   "activation_function": ["sigmoid", "sigmoid"]
}
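The same ideas can be sketched in Python/NumPy (again a hypothetical illustration, not the repo's code): Xavier-initialized weights, sigmoid activations in both layers, and plain gradient descent on mean squared error, trained on XOR.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def xavier(rng, n_in, n_out):
    # U(-sqrt(6/(in+out)), sqrt(6/(in+out))) as described above
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

rng = np.random.default_rng(1)
W1, b1 = xavier(rng, 2, 3), np.zeros(3)   # layers: [2, 3, 1]
W2, b2 = xavier(rng, 3, 1), np.zeros(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets
lr = 0.5

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)        # hidden layer
    out = sigmoid(h @ W2 + b2)      # output layer
    # backpropagate mean squared error through both sigmoid layers
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel()))  # trained XOR predictions
```

Xavier initialization keeps the variance of activations roughly constant across layers, which matters for sigmoids because large pre-activations saturate them and kill the gradient.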

Same as the previous example, but with the momentum method added. It improves training speed and accuracy (it helps avoid getting stuck in a local minimum). XOR gate.

{
   "layers": [2, 5, 1],
   "activation_function": ["sigmoid", "sigmoid"]
}
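The momentum update itself is small; here is a sketch on a toy quadratic loss f(w) = w², with an assumed momentum coefficient (the repo's actual hyperparameters may differ). A velocity term accumulates a decaying average of past gradients, so consistent gradient directions build up speed.

```python
# momentum update on f(w) = w^2, gradient df/dw = 2w
w, velocity = 5.0, 0.0
lr, beta = 0.1, 0.9        # beta is the momentum coefficient (assumed value)

for _ in range(200):
    grad = 2.0 * w
    velocity = beta * velocity - lr * grad  # decaying sum of past gradients
    w += velocity                           # step along the velocity

print(w)  # close to the minimum at 0
```

Compared with the plain update `w -= lr * grad`, the velocity lets the iterate coast through shallow regions and small bumps in the loss surface instead of stalling in them.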

Handwritten digit classification is a multi-class classification problem. The data set used is semeion.data (the Semeion Handwritten Digit data set: 16x16 binary images, hence 256 inputs, with 10 output classes).

{
   "layers": [256, 51, 10],
   "activation_function": ["sigmoid", "softmax"]
}
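The softmax output layer is what makes this a multi-class classifier: it turns the 10 output scores into a probability distribution over the digit classes. A sketch of the numerically stable form:

```python
import numpy as np

def softmax(z):
    # subtract the row-wise max before exponentiating so exp() cannot overflow;
    # the shift cancels out in the ratio
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1]])  # toy scores for 3 classes
p = softmax(logits)
print(p, p.sum())  # probabilities summing to 1
```

The predicted digit is then simply the argmax of the output row, and the sigmoid hidden layer feeds these logits exactly as in the `[256, 51, 10]` configuration above.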

Same as the previous example, but the data is split into small batches (subsets), which improves memory efficiency and accuracy at a trade-off in compute efficiency. Uses root mean squared propagation (RMSProp) instead of plain SGD, and L2 regularization is applied to the model's weights.

{
   "layers": [256, 51, 10],
   "activation_function": ["leaky_relu", "softmax"]
}
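Mini-batching is just slicing the data set before each update; the more interesting part is the RMSProp step with L2 weight decay. A sketch on a toy quadratic loss, with assumed hyperparameter values (not taken from the repo):

```python
import numpy as np

# RMSProp with L2 regularization on f(w) = w^2 + (l2/2) * w^2
w = 5.0
cache = 0.0
lr, decay, eps, l2 = 0.01, 0.9, 1e-8, 1e-4

for _ in range(2000):
    grad = 2.0 * w + l2 * w                    # loss gradient + L2 penalty term
    cache = decay * cache + (1 - decay) * grad ** 2  # running mean of squared grads
    w -= lr * grad / (np.sqrt(cache) + eps)    # per-parameter adaptive step

print(abs(w))  # near the minimum at 0
```

Dividing by the running root-mean-square of the gradient gives each weight its own effective learning rate, which tends to be better behaved on the noisy gradients that mini-batches produce.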

Same as the previous example, but with cross-validation. Implements accuracy, precision, recall and F1-score metrics.

{
   "layers": [256, 51, 10],
   "activation_function": ["sigmoid", "softmax"]
}
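A hypothetical sketch (names assumed, not the repo's API) of k-fold index splitting plus the four listed metrics in a binary, one-vs-rest view; for the 10-digit task these would be computed per class and averaged.

```python
import numpy as np

def kfold_indices(n, k):
    # yield (train, validation) index arrays for k folds
    folds = np.array_split(np.arange(n), k)
    for i in range(k):
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, folds[i]

# toy binary predictions for one class
y_true = np.array([1, 0, 1, 1, 0, 1])
y_pred = np.array([1, 0, 0, 1, 1, 1])

tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives

accuracy = np.mean(y_pred == y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1)  # ~0.667 0.75 0.75 0.75
```

Accuracy alone can be misleading on imbalanced classes (a digit appears in only ~10% of the labels), which is why precision, recall and F1 are tracked alongside it.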

DISCLAIMER: For learning purposes only. Nim has its own machine learning framework, Arraymancer, as well as Torch bindings.

Acknowledgments

License

This library is distributed under the MIT license.
