
KeraTorch: PyTorch-Like ML Framework

Introduction

KeraTorch is a toy project that implements a PyTorch-like ML framework. The goal is to implement commonly used layers and optimizers, such as fully-connected and convolutional layers, the ReLU activation, and the SGD optimizer.

📣 Shout Out To @omaraflak

This project is inspired by the YouTube video by Omar Aflak, where he explains how to implement neural networks from scratch.

This project also supports an autograd framework, inspired by PyTorch Autograd. I am not sure whether my version works exactly like PyTorch's, but implementing an autograd framework is good practice. My implementation is independent of PyTorch's, so it may perform poorly in comparison.
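To sketch the core idea of reverse-mode autograd, here is a minimal scalar version: each node records its parents and a local backward rule, and backward() walks the graph in reverse topological order applying the chain rule. The Value class and its method names below are illustrative only, not KeraTorch's actual API:

```python
class Value:
    """Minimal scalar reverse-mode autograd node (illustrative, not KeraTorch's API)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = lambda: None  # leaves have nothing to propagate

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            # d(a + b)/da = d(a + b)/db = 1; accumulate upstream gradient
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            # d(a * b)/da = b, d(a * b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # topological order over the graph, then chain rule in reverse
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward_fn()

x, y = Value(2.0), Value(3.0)
z = x * y + x   # z = x*y + x, so dz/dx = y + 1 and dz/dy = x
z.backward()
# x.grad == 4.0, y.grad == 2.0
```

GradArray follows the same pattern, but on whole np.ndarrays with per-operation gradient classes (e.g. MatMulGrad) instead of closures.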

⛰️ GradArray

GradArray is a wrapper class around np.ndarray that supports autograd. It holds the array itself, the gradient accumulated on the array, and the computational graph (i.e., how the array was computed).

Here is a short demo of GradArray.

import numpy as np
from common.array import GradArray

A = GradArray(np.array([[2, 3], [5, 6]], dtype=np.float32), name='A')
B = GradArray(np.array([3, 5], dtype=np.float32).reshape(2, 1), name='B')
C = A @ B

print(C)
print(C._array)

C.backward(np.ones_like(C._array)) # ∂f/∂C = 1

print(A._grad) # ∂f/∂A
print(B._grad) # ∂f/∂B

output:

GradArray(name=A @ B, shape=(2, 1), grad_op=MatMulGrad)

[[21.] # C
 [45.]]

[[3. 5.] # gradient on A
 [3. 5.]]

[[7.] # gradient on B
 [9.]]
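These values follow the standard matrix-calculus rules for C = A @ B: given the upstream gradient G = ∂f/∂C, we have ∂f/∂A = G @ Bᵀ and ∂f/∂B = Aᵀ @ G, which is what MatMulGrad computes. A plain-NumPy check of the demo above (variable names here are just for illustration):

```python
import numpy as np

A = np.array([[2, 3], [5, 6]], dtype=np.float32)
B = np.array([3, 5], dtype=np.float32).reshape(2, 1)
G = np.ones((2, 1), dtype=np.float32)  # upstream gradient ∂f/∂C, all ones

grad_A = G @ B.T  # ∂f/∂A: matches the [[3. 5.], [3. 5.]] printed above
grad_B = A.T @ G  # ∂f/∂B: matches the [[7.], [9.]] printed above
```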

Environment Setup

This project is written in Python. The main dependencies are numpy, matplotlib, tqdm, and graphviz. You can install all of them by running the following command.

$ pip install -r requirements.txt

Note that you might need to install graphviz manually.

For Linux,

$ sudo apt-get install graphviz

For Mac,

$ brew install graphviz

Also, this project is built on Python 3.8.17. If you are using conda, you can create a new virtual environment for this project by running the following commands.

$ conda create -n keratorch python=3.8.17
$ conda activate keratorch

Milestones

🔥 Feature, Not a Bug

  • It should be tested whether networks operate properly without batching.
  • The gradient is not accumulated, which may cause problems when the same variable is used in different layers.
  • 🌟 Array class supporting auto gradient calculation
    • Basic array class
    • Basic operations on array
      • Addition
      • Subtraction
      • Scalar Multiplication
      • Scalar Division
      • Matrix Multiplication
      • Real Number Power
      • Expansion (vector -> 2d matrix)
      • Sum with Axis
      • L2 norm with Axis
      • Reshape
      • Backward Propagation
      • Element-wise Multiplication
      • Max / Min
      • Indexing
      • Slicing
      • In-place operations support
      • Various dtype support (not only np.float64)
    • Gradient Calculation
      • Identity
      • Add (variable number of inputs)
      • Reshape
      • Transpose
      • Scalar Multiplication
      • Matrix Multiplication
      • Expansion
      • Sum
      • Power
      • Element-wise Multiplication
      • Max / Min
      • Indexing
      • Slicing
  • Overall backbone structure
    • Layer base class
    • Activation base class
    • Loss base class
    • Optimizer base class
  • Layers
    • Fully-connected layer
    • Convolutional layer
    • Max pooling layer
    • Average pooling layer
    • Batch normalization layer
  • Activation functions (and their derivatives)
    • ReLU
    • Sigmoid
    • Tanh
    • Softmax
  • Implement commonly used loss functions
    • Mean squared error
    • Cross entropy
  • Implement commonly used optimizers
    • SGD
    • Momentum
    • RMSProp
    • Adam
  • 🖥️ Demo
    • OR binary classification
    • XOR binary classification
    • MNIST classification
  • Additional Features
    • Computational graph visualization
    • Initialization
    • Dropout
    • CUDA support
  • GitHub CI/CD
    • Python Linting
    • Unit Testing
    • Deployment
  • Add more soon 😡
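To show how the layer, loss, and optimizer pieces listed above fit together, here is a plain-NumPy sketch of a fully-connected layer trained with vanilla SGD on a mean-squared-error loss. All class, method, and attribute names (Dense, forward, backward, dW, step) are assumptions for illustration, not KeraTorch's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

class Dense:
    """Fully-connected layer: y = x @ W + b (illustrative sketch)."""

    def __init__(self, in_features, out_features):
        self.W = rng.standard_normal((in_features, out_features)) * 0.1
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                      # cache the input for backward
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # gradient w.r.t. weights
        self.db = grad_out.sum(axis=0)  # gradient w.r.t. bias
        return grad_out @ self.W.T      # gradient to propagate upstream

class SGD:
    """Vanilla stochastic gradient descent: p -= lr * dp."""

    def __init__(self, lr=0.1):
        self.lr = lr

    def step(self, layer):
        layer.W -= self.lr * layer.dW
        layer.b -= self.lr * layer.db

# One training step on a tiny regression target.
layer, opt = Dense(2, 1), SGD(lr=0.1)
x = np.array([[1.0, 2.0]])
y = np.array([[1.0]])

pred = layer.forward(x)
grad = 2.0 * (pred - y) / len(x)        # gradient of mean squared error
layer.backward(grad)
opt.step(layer)
```

After opt.step, the squared error of layer.forward(x) is lower than before the update; repeating forward/backward/step drives it toward zero on this single example.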
