np_nets

Neural network experiments written purely in numpy

  1. Learning backprop with the MNIST classification task (see the first sketch below)
  2. A better version of the backprop notebook; it lacks the markdown explanations, but it doesn't have the bug
  3. Synthetic gradients with the MNIST classification task (see the second sketch below)
  4. Hebbian learning, for a Dartmouth class (see the third sketch below)
  5. "U loss" learning
  • I test an ansatz for layer-wise training of neural networks. It didn't work. That's how research goes. (A generic layer-wise sketch appears below.)
  • The code is in the uloss folder.
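
For readers who want the gist without opening the notebooks, here is a minimal sketch of the kind of pure-numpy backprop these experiments build on. The layer sizes, learning rate, and random stand-in data are illustrative assumptions; the actual notebooks train on the images in MNIST_data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random stand-in for MNIST: 256 samples, 784 inputs, 10 classes.
X = rng.standard_normal((256, 784))
y = rng.integers(0, 10, size=256)
Y = np.eye(10)[y]                                 # one-hot targets

# Two-layer net: 784 -> 64 -> 10
W1 = rng.standard_normal((784, 64)) * 0.01
b1 = np.zeros(64)
W2 = rng.standard_normal((64, 10)) * 0.01
b2 = np.zeros(10)
lr = 0.1

for step in range(100):
    # Forward pass
    h = np.maximum(0, X @ W1 + b1)                # ReLU hidden layer
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)             # softmax
    loss = -np.log(p[np.arange(len(y)), y]).mean()

    # Backward pass: softmax + cross-entropy gives (p - Y) at the logits
    dlogits = (p - Y) / len(y)
    dW2, db2 = h.T @ dlogits, dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dh[h <= 0] = 0                                # ReLU gradient
    dW1, db1 = X.T @ dh, dh.sum(axis=0)

    # Plain SGD update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    if step % 25 == 0:
        print(f"step {step}: loss {loss:.3f}")
```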
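The synthetic-gradients experiment follows the decoupled-neural-interfaces idea (Jaderberg et al., 2016): a small auxiliary model learns to predict a layer's gradient, so the layer can update without waiting for the rest of the backward pass. A hedged sketch on stand-in data, with a linear gradient model chosen for brevity; the notebook's actual gradient model and hyperparameters may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((128, 784))
y = rng.integers(0, 10, size=128)
Y = np.eye(10)[y]

W1 = rng.standard_normal((784, 64)) * 0.01   # layer 1
W2 = rng.standard_normal((64, 10)) * 0.01    # layer 2
M = np.zeros((64, 64))                       # linear model: h -> predicted dL/dh
lr, lr_m = 0.1, 0.01

for step in range(200):
    h = np.maximum(0, X @ W1)                # layer-1 activations

    # Layer 1 updates immediately from the *predicted* gradient,
    # without waiting for layer 2's forward/backward pass.
    g_hat = h @ M
    dpre = g_hat * (h > 0)                   # back through the ReLU
    W1 -= lr * X.T @ dpre / len(y)

    # Layer 2 proceeds as usual and yields the true gradient at h.
    logits = h @ W2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    dlogits = p - Y
    g_true = dlogits @ W2.T
    W2 -= lr * h.T @ dlogits / len(y)

    # Regress the synthetic-gradient model toward the true gradient.
    M -= lr_m * h.T @ (g_hat - g_true) / len(y)
```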
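Hebbian learning uses no error signal at all: a weight grows in proportion to the product of pre- and post-synaptic activity. The sketch below uses Oja's variant, which adds a decay term so the weight norm stays bounded, and the single neuron converges toward the top principal component of its inputs. Whether the pset notebook uses exactly this rule is an assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
# Correlated, zero-mean inputs so there is structure for the neuron to find.
X = rng.standard_normal((1000, 20)) @ rng.standard_normal((20, 20))
X -= X.mean(axis=0)

w = rng.standard_normal(20) * 0.1
lr = 0.01

for epoch in range(20):
    for x in X:
        v = w @ x                  # post-synaptic activity
        # Oja's rule: Hebbian term v*x, minus v^2*w to keep ||w|| bounded
        w += lr * v * (x - v * w)

# w now points (approximately) along the first principal component of X
```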
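The README doesn't spell out what the "U loss" is, so the sketch below shows only the generic shape of layer-wise training the experiment belongs to: each layer is fit greedily against its own local objective, and no gradient crosses layer boundaries. The local objective here, a hypothetical linear readout head trained with cross-entropy, stands in for the author's actual U loss.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((256, 784))
y = rng.integers(0, 10, size=256)
Y = np.eye(10)[y]

def train_layer(H, Y, width, lr=0.1, steps=100):
    """Fit one layer against a local readout head; no gradient reaches earlier layers."""
    r = np.random.default_rng(0)
    W = r.standard_normal((H.shape[1], width)) * 0.01   # the layer being trained
    V = r.standard_normal((width, 10)) * 0.01           # throwaway local head
    for _ in range(steps):
        Z = np.maximum(0, H @ W)
        logits = Z @ V
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        d = (p - Y) / len(Y)
        dZ = (d @ V.T) * (Z > 0)
        V -= lr * Z.T @ d
        W -= lr * H.T @ dZ
    return W

# Greedy stack: train layer 1, freeze it, then train layer 2 on its output.
W1 = train_layer(X, Y, 128)
H1 = np.maximum(0, X @ W1)
W2 = train_layer(H1, Y, 64)
```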