
Bishõgrad

A Bishõ autograd engine in Python, along with a lightweight deep neural-network library! (inspired by micrograd by Andrej Karpathy)

What's Bishõ?

微小 - Bishõ is the Japanese word for 'tiny', since my implementation is very tinyyyyy ^_^ compared to PyTorch/TensorFlow

What's Hako?

箱 - Hako is the Japanese word for 'box'; here, Hako signifies the neurons in our network ;D

Installation

  • Install via pip:
    pip install bishograd==0.1.0
    

Example

Wondering how this works? Check out the examples.
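
Below is a minimal usage sketch, assuming a micrograd-style Value API. The import path and attribute names here are assumptions, not confirmed by the package; refer to the examples for the real interface.

    # Hypothetical sketch: assumes a micrograd-like Value class with autograd.
    from bishograd import Value  # assumed import path

    a = Value(2.0)
    b = Value(-3.0)
    c = a * b + a ** 2   # builds a tiny computation graph
    c.backward()         # backpropagates gradients through the graph

    print(c.data)  # forward value: 2*(-3) + 2**2 = -2.0
    print(a.grad)  # dc/da = b + 2*a = 1.0
    print(b.grad)  # dc/db = a = 2.0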

Targets:

  • ReLU activation function
  • Add MLP.training() to automate the whole training loop (see the sketch after this list)
  • Add Sigmoid, LeakyReLU, and other activation functions
  • Add loss functions: categorical cross-entropy, mean-squared error, etc.
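
As a rough idea of the MLP.training() target, here is a hedged sketch of what such a helper could look like. It assumes a micrograd-style MLP that is callable for the forward pass and exposes parameters(); since this target is not implemented yet, all names below are assumptions rather than the library's API.

    # Hypothetical sketch only: MLP.training() is an open target, so the model,
    # parameters(), and gradient attributes below are assumptions.
    def training(model, xs, ys, epochs=100, lr=0.05):
        """Plain SGD loop: forward pass, mean-squared loss, backward pass, update."""
        for _ in range(epochs):
            preds = [model(x) for x in xs]                       # forward pass
            loss = sum((p - y) ** 2 for p, y in zip(preds, ys))  # mean-squared loss
            for p in model.parameters():                         # reset gradients
                p.grad = 0.0
            loss.backward()                                      # backpropagate
            for p in model.parameters():                         # gradient-descent step
                p.data -= lr * p.grad
        return loss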

Contribution:

This project is open for contribution!

  • Clone this repository:
git clone https://github.com/AK3847/Bishograd.git

The primary way to contribute is to raise an issue or open a pull request with a proper description and consistent code formatting. You can work on any of the Targets or suggest new features as well.

Drop a star if this project helped you in any way! ( ノ ゚ー゚)ノ