micrograd.c is a C implementation of a tiny autograd engine, inspired by Andrej Karpathy's micrograd.
The autograd engine handles automatic differentiation, computing gradients for scalar operations such as addition, multiplication, power, and the ReLU activation. The neural network components build on this engine, providing neurons, layers, and multi-layer perceptrons (MLPs) for constructing and training models.
In the training loop, each iteration performs a forward pass to compute the outputs and the loss, followed by a backward pass to compute gradients. The parameters are then updated using the Adam optimizer.
The tests validate the functionality of the autograd engine and neural network components. They check the correctness of gradient computations, forward and backward passes, and overall network behavior.
The Makefile is used to compile the project. It defines rules for building the test executables.
To build the project, run:
make
This compiles the source files and produces the test_engine and test_nn executables, which you can then run to test the autograd engine and the neural network components, respectively.