
This repository implements perturbation-based algorithms for training neural networks. We evaluate how node perturbation scales with network width and depth on standard datasets such as MNIST and CIFAR.


Perturbations

❗❗ NOTE: Here is the link to our new project repository, which is much better documented and contains more comprehensive experiments. Best of all, it's written in JAX, so it's also much easier to follow!

This repository contains a set of experiments, written in TensorFlow 1.x, exploring the node perturbation algorithm for training deep, fully connected neural networks.

Usage

Run master.py to train a multi-layer perceptron with either SGD or node perturbation (NP).

```
python master.py -lr 0.1 -update_rule np -n_hl 3 -hl_size 300 -n_epochs 5
```

All arguments are optional:

- lr: learning rate
- update_rule: 'np' (node perturbation) or 'sgd'
- n_hl: number of hidden layers (network depth)
- hl_size: width of each hidden layer (constant across the network)
- n_epochs: number of training epochs
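For readers unfamiliar with the algorithm, here is a minimal NumPy sketch of a node perturbation update for a single linear layer. This is an illustration, not the repository's code; the function names (`node_perturbation_step`, `mse_loss`) and hyperparameters (`sigma`, `lr`) are made up for this example. The idea: inject Gaussian noise into a layer's pre-activations, measure the resulting change in loss, and use (loss change / noise variance) times the noise as an estimate of the gradient with respect to those pre-activations.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_loss(y_pred, y_true):
    # Standard mean-squared-error loss.
    return 0.5 * np.mean((y_pred - y_true) ** 2)

def node_perturbation_step(W, x, y_true, lr=0.05, sigma=1e-3, rng=rng):
    """One node-perturbation update for a single linear layer z = W @ x.

    Estimates grad of loss w.r.t. pre-activations z as (dL / sigma^2) * noise,
    then applies the usual outer-product rule to get the weight update.
    """
    z_clean = W @ x                               # clean pre-activations
    noise = rng.normal(0.0, sigma, z_clean.shape)
    z_pert = z_clean + noise                      # perturbed pre-activations
    dL = mse_loss(z_pert, y_true) - mse_loss(z_clean, y_true)
    grad_z = (dL / sigma**2) * noise              # gradient estimate at z
    return W - lr * np.outer(grad_z, x)           # weight update
```

In expectation this step follows the true gradient (the noise correlates with the loss change exactly along the descent direction), but individual updates are noisy, which is why the repository studies how the method scales with width and depth.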
