README.md
activations.png
cumulative_plot.m
encode_labels.m
evaluate.m
minimize.m
mnist_all.mat
neural_network.m
neural_network_cost.m
normalize.m
predict.m
roll.m
shuffle_data.m
sigmoid.m
sigmoidGradient.m
softmax.m
test.m
theta.mat
unroll.m

README.md

This is an implementation of a feed-forward neural network trained with backpropagation, written in MATLAB. It achieves over 95% recognition accuracy on the MNIST handwritten digit dataset.
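
For orientation, the feed-forward step is just a matrix multiply followed by a nonlinearity at each layer. Below is a minimal sketch of one forward pass with sigmoid hidden units and a softmax output; the layer sizes and variable names are hypothetical, and the activations are written inline rather than through the repository's `sigmoid.m` and `softmax.m`.

```matlab
% Minimal sketch of a single feed-forward pass; layer sizes and
% variable names are hypothetical, not taken from the repository.
X = rand(100, 784);                   % 100 flattened 28x28 MNIST images
Theta1 = 0.01 * randn(785, 30);       % input -> hidden weights (bias row included)
Theta2 = 0.01 * randn(31, 10);        % hidden -> output weights (bias row included)

a1 = [ones(size(X, 1), 1), X];        % prepend bias column to the inputs
a2 = 1 ./ (1 + exp(-(a1 * Theta1)));  % sigmoid hidden activations
a2 = [ones(size(a2, 1), 1), a2];      % prepend bias column to the hidden layer
z3 = a2 * Theta2;
e  = exp(bsxfun(@minus, z3, max(z3, [], 2)));
h  = bsxfun(@rdivide, e, sum(e, 2));  % softmax: rows of class probabilities
```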

To train and test the network, use the following code (the random number generator state is saved and restored so that each call to `test` operates on the same random subset of the data):

```matlab
s = rng;                  % Store random number generator state
params = test;            % Train network on random subset (and do cross-validation)
params.iterations = 300;  % Set iterations
rng(s);
theta = test(params);     % Train final network
rng(s);
test(theta);              % Test final network
```
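
If you want to keep the trained parameters around between sessions, one option is MATLAB's built-in `save`/`load`; a minimal sketch follows (the file name `my_theta.mat` is hypothetical, chosen to avoid overwriting the repository's `theta.mat`):

```matlab
% Minimal sketch: persist and reuse the trained parameters.
% Assumes `theta` from the final training call above; `my_theta.mat`
% is a hypothetical file name, not part of the repository.
save('my_theta.mat', 'theta');   % write the trained weights to disk
loaded = load('my_theta.mat');   % reload them in a later session
test(loaded.theta);              % re-run the evaluation with the stored weights
```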