
Flint

A toy deep learning framework implemented in NumPy from scratch, with a PyTorch-like API. I'm trying to make it as clean as possible.

Flint is not as powerful as torch, but it is still able to start a fire.

Installation

git clone https://github.com/Renovamen/flint.git
cd flint
python setup.py install

or

pip install git+https://github.com/Renovamen/flint.git --upgrade

Documentation

Documentation is available here.

Example

Add these imports:

import flint
from flint import nn, optim, Tensor

Build your net first:

class Net(nn.Module):
    def __init__(self, in_features, n_classes):
        super(Net, self).__init__()

        self.l1 = nn.Linear(in_features, 5)
        self.l2 = nn.Linear(5, n_classes)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.l1(x)
        out = self.relu(out)
        out = self.l2(out)
        return out

Or you may prefer to use a Sequential container:

class Net(nn.Module):
    def __init__(self, in_features, n_classes):
        super(Net, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(in_features, 5),
            nn.ReLU(),
            nn.Linear(5, n_classes)
        )

    def forward(self, x):
        out = self.model(x)
        return out

Define the hyperparameters:

# training parameters
n_epoch = 20
lr = 0.001
batch_size = 5

# model parameters
in_features = 10
n_classes = 2

Here we generate a fake dataset:

import numpy as np

inputs = np.random.rand(batch_size, in_features)           # shape: (batch_size, in_features)
targets = np.random.randint(0, n_classes, (batch_size, ))  # integer class labels
x, y = Tensor(inputs), Tensor(targets)

Initialize your model, optimizer and loss function:

net = Net(in_features, n_classes)
net.train()
optimizer = optim.Adam(params=net.parameters(), lr=lr)
loss_function = nn.CrossEntropyLoss()

Then we can train it:

for i in range(n_epoch):
    # clear gradients
    optimizer.zero_grad()
    
    # forward prop.
    scores = net(x)

    # compute loss and do backward prop.
    loss = loss_function(scores, y)
    loss.backward()
    
    # update weights
    optimizer.step()

    # compute accuracy
    preds = scores.argmax(dim=1)
    correct_preds = flint.eq(preds, y).sum().data
    accuracy = correct_preds / y.shape[0]

    # print training status
    print(
        'Epoch: [{0}/{1}]\t'
        'Loss {loss:.4f}\t'
        'Accuracy {acc:.3f}'.format(
            i + 1, n_epoch,
            loss = loss.data,
            acc = accuracy
        )
    )
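After training, switch the network to evaluation mode before running inference. A minimal sketch, assuming net.eval() exists as the counterpart of the net.train() call used above:

net.eval()  # assumed counterpart of net.train() above
scores = net(x)
preds = scores.argmax(dim=1)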

Check the examples folder for more detailed examples.

Features / To-Do List

Autograd

Supports autograd on the following operations (a short usage sketch follows the list):

  • Add
  • Subtract
  • Negative
  • Multiply
  • Divide
  • Matmul
  • Power
  • Natural Logarithm
  • Exponential
  • Sum
  • Max
  • Softmax
  • Log Softmax
  • View
  • Transpose
  • Permute
  • Squeeze
  • Unsqueeze
  • Padding
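Gradients flow through any composition of these ops. A minimal sketch, assuming Tensor accepts a requires_grad flag and overloads * for Multiply, as in PyTorch:

from flint import Tensor

a = Tensor([[1., 2.], [3., 4.]], requires_grad=True)  # requires_grad is an assumed flag
b = Tensor([[5., 6.], [7., 8.]], requires_grad=True)
c = (a * b).sum()  # Multiply, then Sum, both tracked by autograd
c.backward()       # backprop from the scalar output
print(a.grad)      # dc/da, which should equal b's data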

Layers

  • Linear
  • Convolution (1D / 2D)
  • MaxPooling (1D / 2D)
  • Unfold
  • RNN
  • Flatten
  • Dropout
  • BatchNormalization
  • Sequential
  • Identity
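Most of these layers compose through the Sequential container shown earlier. A minimal sketch, assuming Flatten and Dropout take PyTorch-like constructor arguments:

model = nn.Sequential(
    nn.Flatten(),               # assumed: flattens trailing dimensions
    nn.Linear(in_features, 16),
    nn.ReLU(),
    nn.Dropout(0.5),            # assumed signature: drop probability
    nn.Linear(16, n_classes)
)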

Optimizers

  • SGD
  • Momentum
  • Adagrad
  • RMSprop
  • Adadelta
  • Adam
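Each of these is a drop-in replacement for the Adam instance in the example above, sharing the zero_grad() / step() interface used there. A sketch, assuming SGD takes the same params / lr arguments as Adam:

optimizer = optim.SGD(params=net.parameters(), lr=lr)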

Loss Functions

  • Cross Entropy
  • Negative Log Likelihood
  • Mean Squared Error
  • Binary Cross Entropy
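All losses follow the call pattern shown for CrossEntropyLoss: construct once, then call on predictions and targets to get a scalar Tensor for backward(). A sketch for a regression setup, where MSELoss is an assumed class name mirroring PyTorch, and predictions / targets are hypothetical regression tensors:

loss_function = nn.MSELoss()                # assumed class name
loss = loss_function(predictions, targets)  # hypothetical tensors
loss.backward()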

Activation Functions

  • ReLU
  • Leaky ReLU
  • Sigmoid
  • Tanh
  • GELU

Initializers

Others

  • Dataloaders
  • Support GPU

License

MIT

Acknowledgements

Flint is inspired by several similar projects.
