
(OLD) Multi-layer feed-forward ANN with backpropagation using STL on MNIST data set (hand-written digit recognition).


Hysperr/neural-network


Neural_Network

C++11 Required

Back-propagation algorithm for learning in multilayer networks

function BACK-PROP-LEARNING(examples, network) returns a neural network
	inputs: examples, a set of examples, each with input vector x and output vector y
		network, a multilayer network with L layers, weights wi,j, activation function g
	local variables: Δ, a vector of errors, indexed by network node
	for each weight wi,j in network do
		wi,j ← a small random number
	repeat
		for each example (x, y) in examples do
			/* Propagate the inputs forward to compute the outputs */
			for each node i in the input layer do
				ai ← xi
			for ℓ = 2 to L do
				for each node j in layer ℓ do
					inj ← Σi wi,j ai
					aj ← g(inj)
			/* Propagate deltas backward from output layer to input layer */
			for each node j in the output layer do
				Δ[j] ← g′(inj) × (yj − aj)
			for ℓ = L − 1 to 1 do
				for each node i in layer ℓ do
					Δ[i] ← g′(ini) Σj wi,j Δ[j]
			/* Update every weight in network using deltas */
			for each weight wi,j in network do
				wi,j ← wi,j + α × ai × Δ[j]
	until some stopping criterion is satisfied
	return network

These files are part of a personal attempt to create a Convolutional Neural Network (CNN/ConvNet). The two classes that make up the base neural net are tightly coupled and act as a single unit, merely separated into two classes. In short, the Node class provides the building blocks for the NeuralNet class, which lets users create multiple feed-forward multilayer artificial neural networks with backpropagation. From there, a linker class will join these so that we have another neural net object in which each node is itself a neural net.

Included Files

  • A driver file, digit training file, and test file are provided as a base demonstration.
  • A configuration file is provided to tweak your neural net's characteristics.

Function Call Order

  1. Start by initializing your constant variables
    • number of input nodes

    • number of output nodes

    • learning rate

  2. Create a std::map for hidden layer nodes
    • key = layer number (start from 0, increment by 1), value = nodes per layer

  3. Call constructor NeuralNet()
    • The bias-node parameter defaults to false; pass true to include bias nodes

  4. Call set_output_identity()
    • pass in a std::map to label your output nodes

    • key = output node (start from 0, increment by 1), value = output node's identity

  5. Insert data using insert_data()
    • pass in a std::vector holding the data

  6. Call forward_propagate()
    • activates and passes data

  7. Call choose_answer() to have net select its answer
    • The belief-printing parameter defaults to false; pass true to print the neural net's belief values

    • If testing after training DO NOT backpropagate! Go to step 9

  8. Call back_propagate()
    • trains network by updating weights

  9. If running multiple epochs, remember to clear network!

Congrats! You've Run A Neural Network :D
