Implementations of Neural Network models and training algorithms from scratch [WIP]

Install

# Navigate to this repo in the terminal
pip install -e .

Usage

from neural_nets.backprop import BackPropNet
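The constructor and training API of `BackPropNet` aren't documented here, so as a neutral illustration of the from-scratch approach this repo takes, here is a minimal backprop training loop in plain NumPy on the XOR task (all names below are mine, not the repo's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR dataset
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units
W1 = rng.normal(size=(2, 8))
W2 = rng.normal(size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(10000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward pass: chain rule for squared error through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

mse = float(np.mean((out - y) ** 2))
```

After training, `out` approximates the XOR targets and `mse` is close to zero.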

Models and Algos

  • Feedforward network of arbitrary size and activation functions

  • Backprop with arbitrary cost function

  • Feedback alignment based training for FNNs [1]

  • Recurrent network with arbitrary size and activation functions

  • Backprop through time [WIP]

  • RNN training with feedback alignment [2]
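Feedback alignment [1] replaces the transpose of the forward weights in the backward pass with a fixed random matrix; the forward weights gradually align with it during training. A sketch of the idea in NumPy on a toy XOR setup (my own illustration, not the repo's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy XOR dataset
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8))
W2 = rng.normal(size=(8, 1))
B = rng.normal(size=(8, 1))  # fixed random feedback weights, never updated

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(10000):
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    d_out = (out - y) * out * (1 - out)
    # Key difference from backprop: the error is sent back through the
    # fixed random matrix B rather than W2.T, yet learning still proceeds [1].
    d_h = (d_out @ B.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h
    losses.append(float(np.mean((out - y) ** 2)))
```

The only change from standard backprop is the single line computing `d_h`, which is what makes the rule biologically plausible: no weight transport between forward and backward pathways.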

Dependencies

NumPy and Matplotlib; there are plans to use JAX for some of the gradient operations.

References

[1] Lillicrap, T. P., et al. (2016). Random synaptic feedback weights support error backpropagation for deep learning. Nature Communications, 7, 13276.

[2] Murray, J. M. (2018). Local online learning in recurrent networks with random feedback. bioRxiv, 458570.
