This repository contains a C++ implementation of Andrej Karpathy's micrograd, a tiny autograd engine. It is a minimalist, educational codebase meant to illustrate how backpropagation works in popular deep learning frameworks.
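The core idea behind such an engine can be sketched in a few dozen lines. The following is a simplified illustration of reverse-mode autodiff, not the repo's actual implementation; the names (`Node`, `NodePtr`, `backward`) are hypothetical:

```cpp
#include <functional>
#include <memory>
#include <unordered_set>
#include <vector>

// A node in the computation graph: a value, its gradient, its parents,
// and a closure that propagates the gradient one step backward.
struct Node {
    double data;
    double grad = 0.0;
    std::vector<std::shared_ptr<Node>> prev;   // parents in the graph
    std::function<void()> backward_fn = [] {}; // no-op for leaf nodes
    explicit Node(double d) : data(d) {}
};
using NodePtr = std::shared_ptr<Node>;

NodePtr operator+(const NodePtr& a, const NodePtr& b) {
    auto out = std::make_shared<Node>(a->data + b->data);
    out->prev = {a, b};
    out->backward_fn = [a, b, out] {
        a->grad += out->grad;  // d(a+b)/da = 1
        b->grad += out->grad;  // d(a+b)/db = 1
    };
    return out;
}

NodePtr operator*(const NodePtr& a, const NodePtr& b) {
    auto out = std::make_shared<Node>(a->data * b->data);
    out->prev = {a, b};
    out->backward_fn = [a, b, out] {
        a->grad += b->data * out->grad;  // d(a*b)/da = b
        b->grad += a->data * out->grad;  // d(a*b)/db = a
    };
    return out;
}

// Topologically sort the graph, then apply the chain rule in reverse.
void backward(const NodePtr& root) {
    std::vector<NodePtr> topo;
    std::unordered_set<Node*> visited;
    std::function<void(const NodePtr&)> build = [&](const NodePtr& n) {
        if (visited.insert(n.get()).second) {
            for (const auto& p : n->prev) build(p);
            topo.push_back(n);
        }
    };
    build(root);
    root->grad = 1.0;
    for (auto it = topo.rbegin(); it != topo.rend(); ++it) {
        (*it)->backward_fn();
    }
}
```

With these pieces, building `c = a * b + a` and calling `backward(c)` fills in `a->grad` and `b->grad` via the chain rule; the real library adds more operators (`pow`, `relu`, …) on the same pattern.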
To get a local copy up and running, follow these simple steps.

Prerequisites:

- A modern compiler with C++20 support
- CMake
- Clone the repo

  ```sh
  git clone https://github.com/akdemironur/micrograd-cpp.git
  ```

- Build the project

  ```sh
  cd micrograd-cpp
  mkdir build && cd build
  cmake ..
  make
  ```
To use the micrograd-cpp library, create `std::shared_ptr<Value>` objects (typedef'd as `ValuePtr`), build up an expression, and call `backward()` on the result. A simple example:
```cpp
// Assumes the library's header(s) providing Value/ValuePtr are included,
// along with <format>, <iostream>, and <memory>.
auto a = std::make_shared<Value>(-4.0, "a");
auto b = std::make_shared<Value>(2.0, "b");
auto c = a + b;
auto d = a * b + pow(b, 3);
c = c + c + 1;
c = c + 1 + c + (-a);
d = d + d * 2 + relu(b + a);
d = d + 3 * d + relu(b - a);
auto e = c - d;
auto f = pow(e, 2);
auto g = f / 2.0;
g = g + 10.0 / f;
std::cout << std::format("{:.4f}", g->data()) << std::endl;
g->backward();
std::cout << std::format("{:.4f}", a->grad()) << std::endl;
std::cout << std::format("{:.4f}", b->grad()) << std::endl;
g->printDOT("g.dot");
```
The computation graph can be visualized with the `printDOT(std::string fileName)` method, which writes the graph in DOT format to a file that can be rendered with tools such as Graphviz.
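Assuming Graphviz is installed, the emitted `g.dot` file can be rendered to an image with the standard `dot` command:

```sh
# Render the DOT file produced by printDOT("g.dot") to a PNG image
dot -Tpng g.dot -o g.png
```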