A Torch-like C++ framework for building neural network graphs. It runs on an automatic differentiation tensor library built on top of vecLib (with plans to move to a more cross-platform backend). A small collection of prebuilt layers, loss functions and optimizers is available.
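To make the role of the autodiff library concrete, here is a minimal, self-contained sketch of reverse-mode automatic differentiation on scalars. This is an illustration of the mechanism only, not this library's actual `autodiff::Var` API; the `Node`, `make_var`, `add`, `mul` and `backward` names are invented for the example.

```cpp
#include <memory>
#include <utility>
#include <vector>

// One node in the computation graph: a value, its accumulated gradient,
// and edges to the inputs it was computed from, each carrying the local
// derivative of this node with respect to that input.
struct Node {
    double value = 0.0;
    double grad  = 0.0;
    std::vector<std::pair<std::shared_ptr<Node>, double>> children;
};

using Var = std::shared_ptr<Node>;

Var make_var(double v) {
    auto n = std::make_shared<Node>();
    n->value = v;
    return n;
}

Var add(const Var& a, const Var& b) {
    auto n = make_var(a->value + b->value);
    n->children = {{a, 1.0}, {b, 1.0}};  // d(a+b)/da = 1, d(a+b)/db = 1
    return n;
}

Var mul(const Var& a, const Var& b) {
    auto n = make_var(a->value * b->value);
    n->children = {{a, b->value}, {b, a->value}};  // product rule
    return n;
}

// Walk the graph from the output, accumulating chain-rule contributions
// into each node's grad field.
void backward(const Var& out, double seed = 1.0) {
    out->grad += seed;
    for (auto& [child, local] : out->children)
        backward(child, seed * local);
}
```

Calling `backward` on the final loss node fills in the gradient of every variable that contributed to it, which is what makes the optimizer step possible.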
Derive the neural network object you want to use from the nn::Net base class. Write the forward-pass function using autodiff::Var for variables and Parameter for parameters. Prebuilt layers with parameters, such as nn::FullyConnected, are available.
```cpp
class Network : public nn::Net {
public:
    Network() : nn::Net(), fc1(10, 5, this), fc2(5, 1, this) {}
    nn::FullyConnected fc1, fc2;
    autodiff::Var forward(autodiff::Var& x) {
        auto y = fc1(x);
        auto z = fc2(y);
        return z;
    }
};
```
After initialization, the network is registered with an optimizer, which gets access to the parameters and their gradients.
```cpp
Network net;
opt::GD opt(net.params);
```
Training is performed using the `forward(x)` method, and backpropagation using `backward(l)`. Optimization is then performed using the `step()` method.
```cpp
auto output = net.forward(x);
auto l = loss::SoftMax(output, targets);
net.backward(l);
opt.step();
```
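For intuition, here is a self-contained sketch of what a plain gradient-descent `step()` conceptually does: update each parameter in place from its accumulated gradient. This is assumed semantics for illustration, not this library's `opt::GD` implementation; the `Param` struct, the learning-rate default, and the gradient clearing are all assumptions of the example.

```cpp
#include <utility>
#include <vector>

// Hypothetical parameter: a value plus the gradient accumulated by
// backpropagation.
struct Param {
    double value;
    double grad;
};

// Hypothetical plain gradient-descent optimizer.
struct GD {
    std::vector<Param*> params;
    double lr;
    explicit GD(std::vector<Param*> p, double lr = 0.01)
        : params(std::move(p)), lr(lr) {}

    // One step: p <- p - lr * dL/dp, then clear the gradient so the next
    // backward pass starts fresh (assumed behaviour).
    void step() {
        for (Param* p : params) {
            p->value -= lr * p->grad;
            p->grad = 0.0;
        }
    }
};
```

Repeating forward, loss, backward and `step()` in a loop over the training data is the whole training procedure.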