```
function BACK-PROP-LEARNING(examples, network) returns a neural network
  inputs: examples, a set of examples, each with input vector x and output vector y
          network, a multilayer network with L layers, weights wi,j, activation function g
  local variables: Δ, a vector of errors, indexed by network node

  for each weight wi,j in network do
      wi,j ← a small random number
  repeat
      for each example (x, y) in examples do
          /* Propagate the inputs forward to compute the outputs */
          for each node i in the input layer do
              ai ← xi
          for ℓ = 2 to L do
              for each node j in layer ℓ do
                  inj ← Σi wi,j ai
                  aj ← g(inj)
          /* Propagate deltas backward from output layer to input layer */
          for each node j in the output layer do
              Δ[j] ← g′(inj) × (yj − aj)
          for ℓ = L − 1 to 1 do
              for each node i in layer ℓ do
                  Δ[i] ← g′(ini) Σj wi,j Δ[j]
          /* Update every weight in network using deltas */
          for each weight wi,j in network do
              wi,j ← wi,j + α × ai × Δ[j]
  until some stopping criterion is satisfied
  return network
```
These files are part of a personal attempt to create a Convolutional Neural Network (CNN/ConvNet). The two classes that make up the base neural net are tightly coupled and effectively act as a single unit, merely split across files. In short, the Node class provides the building blocks for the NeuralNet class, which lets users create multiple feed-forward, multilayer artificial neural networks trained with backpropagation. From there, a linker class will join these nets into another neural-net object in which each node is itself a neural net.
- A driver file, a digit-training file, and a test file are provided as a base demonstration.
- A configuration file is provided to tweak your neural net's characteristics.
- Start by initializing your constant variables.
- Create a `std::map` for hidden-layer nodes.
- Call the constructor `NeuralNet()`.
- Call `set_output_identity()`.
- Insert data using `insert_data()`.
- Call `forward_propagate()`.
- Call `choose_answer()` to have the net select its answer.
- Call `back_propagate()`.
- If running multiple epochs, remember to clear the network!