Simple-NumPy-ML

Simple and basic machine learning algorithms implemented from scratch in NumPy.

This is my testground for the basics of machine learning. The algorithms in this project are all implemented in pure NumPy (see ./requirements.txt) with Python 3, and the project contains many training presets for you to explore. To get a taste of what this project has to offer, let us start by training a model to recognize handwritten digits by following these steps:

git clone https://github.com/TzuChieh/Simple-NumPy-ML.git
cd Simple-NumPy-ML
python example.py

This will start training a very basic model right away. From the console output, you should see something like this:

Epoch 6 / 10:
SGD: [████████████████████████████████████████] 100.00% (10.75 ms/batch, 0.00 mins left) | eval perf: 9343 / 10000 (0.9343) | eval cost: 0.4105 | Δt_epoch: 0:00:16.809815 | Δt_report: 0:00:01.866717
Epoch 7 / 10:
SGD: [████████████____________________________]  31.09% (10.89 ms/batch, 0.20 mins left)

Let it run for a while. Once epoch 10 is reached, the training will stop and output a trained model to ./output/MNIST Basic Network.model. From the log, you can see that this model reaches around 95% accuracy in recognizing handwritten digits.

Features

Currently most of the features are for building neural networks. Some visualization utilities are also provided for analyzing generated data.

Layers

  • Reshape
  • Fully Reshape (similar to Reshape, but as a layer wrapper)
  • Fully Connected (also commonly known as dense layer)
  • Convolution (arbitrary kernel shape and stride; supports both tied and untied biases)
  • Pool (arbitrary kernel shape and stride; supports max and mean pooling)
  • Dropout

All layers support the following initialization modes (if applicable): Zeros, Ones, Constant, Gaussian, LeCun, Xavier, Kaiming He.
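As a sketch of how the Gaussian-based initialization modes differ, the following shows LeCun, Xavier, and Kaiming He initialization in plain NumPy. The function names here are illustrative, not the project's API; the actual implementations live in ./model/initializer.py.

```python
import numpy as np

def lecun_init(fan_in, fan_out, rng=None):
    # LeCun: Gaussian with std = sqrt(1 / fan_in)
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, np.sqrt(1.0 / fan_in), (fan_in, fan_out))

def xavier_init(fan_in, fan_out, rng=None):
    # Xavier (Glorot): std = sqrt(2 / (fan_in + fan_out))
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, np.sqrt(2.0 / (fan_in + fan_out)), (fan_in, fan_out))

def kaiming_init(fan_in, fan_out, rng=None):
    # Kaiming He: std = sqrt(2 / fan_in), suited to ReLU layers
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, fan_out))
```

The three differ only in how the standard deviation scales with the layer's fan-in and fan-out.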

Source code (layers): ./model/layer.py
Source code (parameter initializers): ./model/initializer.py
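For illustration, a minimal fully connected (dense) layer with forward and backward passes could look like the sketch below. The class and attribute names are invented for this example; see ./model/layer.py for the project's actual implementation.

```python
import numpy as np

class FullyConnected:
    """Minimal dense layer sketch: y = x @ W + b."""

    def __init__(self, fan_in, fan_out, rng=None):
        rng = rng or np.random.default_rng(0)
        # Kaiming-style Gaussian initialization for the weights
        self.W = rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, fan_out))
        self.b = np.zeros(fan_out)

    def forward(self, x):
        self.x = x                      # cache input for backprop
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Gradients w.r.t. parameters, then gradient w.r.t. the input
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T
```

The cached input in `forward` is what makes the parameter gradients in `backward` computable without re-running the network.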

Activation Functions

  • Identity (simply pass through variables)
  • Sigmoid
  • Tanh
  • Softmax
  • ReLU
  • Leaky ReLU

Source code: ./model/activation.py
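A few of these activation functions can be sketched in plain NumPy as follows (the project's implementations in ./model/activation.py may differ in detail):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # pass negative inputs through with a small slope instead of zeroing them
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    # subtract the row-wise max for numerical stability
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```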

Cost Functions

  • Quadratic (also known as MSE, L2)
  • Cross Entropy

Source code: ./model/cost.py
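As an illustration, both cost functions fit in a few lines of NumPy. The batch-averaging convention here is my assumption; see ./model/cost.py for the project's exact definitions.

```python
import numpy as np

def quadratic_cost(a, y):
    # 0.5 * squared error per sample, averaged over the batch
    return 0.5 * np.mean(np.sum((a - y) ** 2, axis=-1))

def cross_entropy_cost(a, y, eps=1e-12):
    # clip activations to avoid log(0)
    a = np.clip(a, eps, 1.0 - eps)
    return -np.mean(np.sum(y * np.log(a) + (1 - y) * np.log(1 - a), axis=-1))
```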

Optimizers

  • Stochastic Gradient Descent (SGD, with momentum)
  • Adaptive moment estimation (Adam)

All optimizers support:

  • Mini-batch
  • L2 Regularization
  • Gradient Clipping (by norm)
  • Multi-core synchronous parameter update
  • Multi-core asynchronous parameter update
    • With gradient staleness compensation

Source code: ./model/optimizer.py
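To give an idea of how these pieces fit together, here is a sketch of a single SGD-with-momentum update that also applies L2 regularization and gradient clipping by global norm. Function name and hyperparameter defaults are illustrative only; the project's optimizers are in ./model/optimizer.py.

```python
import numpy as np

def sgd_step(params, grads, velocities, lr=0.1, momentum=0.9,
             weight_decay=1e-4, clip_norm=5.0):
    """One mini-batch update; params and velocities are modified in place."""
    # clip by the global norm over all gradient arrays
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, clip_norm / (total + 1e-12))
    for p, g, v in zip(params, grads, velocities):
        g = g * scale + weight_decay * p    # L2 regularization term
        v *= momentum                       # decay the accumulated velocity
        v -= lr * g
        p += v                              # in-place parameter update
```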

Visualization

  • A simple GUI program for viewing training reports

Source code: ./gui/

Interesting Reads

This project is inspired by a collection of resources that I found useful on the Internet.

Some Notes

  • You DO NOT need a GPU to train your model.
  • I do not care about the execution speed of the constructed model, as the main purpose of this project is for me to understand the basics of the field. It is slow, but it can still get the job done in a reasonable amount of time (for small networks).
  • Currently, convolution/correlation is implemented in the naive way (sliding a kernel across the matrix). Ideally, both the feedforward and backpropagation passes of a convolutional layer could be implemented as matrix multiplications.
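The naive sliding-kernel approach mentioned above can be sketched as follows (single channel, "valid" padding; this mirrors the idea, not the project's exact code):

```python
import numpy as np

def correlate2d_valid(x, k, stride=(1, 1)):
    """Naive 'valid' cross-correlation: slide kernel k across matrix x."""
    sh, sw = stride
    oh = (x.shape[0] - k.shape[0]) // sh + 1
    ow = (x.shape[1] - k.shape[1]) // sw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # elementwise product of the kernel with the current window
            window = x[i*sh:i*sh + k.shape[0], j*sw:j*sw + k.shape[1]]
            out[i, j] = np.sum(window * k)
    return out
```

The double Python loop is exactly why this is slow; an im2col-style rewrite would replace it with a single matrix multiplication.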

Datasets

MNIST

The MNIST dataset for training and evaluation of handwritten digits is obtained from Yann LeCun's MNIST page. This dataset is included in ./dataset/mnist/.

Fashion-MNIST

The Fashion-MNIST dataset contains small grayscale images of clothing and can be obtained from its GitHub repository. It is an MNIST-like dataset designed to be a drop-in replacement for MNIST, with greater variety and harder-to-predict classes. This dataset is included in ./dataset/fashion_mnist/.

CIFAR-10

CIFAR-100

Additional Dependencies (optional)

As mentioned earlier, the only required third-party library is NumPy. Additional libraries can be installed to support more functionality (see ./requirements_extra.txt). To install all dependencies in one go, pick the requirement files of your choice and execute (using two files as an example):

pip install -r requirements.txt -r requirements_extra.txt
