This project is a C++ implementation of Andrej Karpathy's Micrograd, a lightweight library for automatic differentiation. The library is built from scratch and demonstrates the foundational concepts of computational graphs and backpropagation. Building it in C++ opens the door to performance optimizations, advanced programming techniques, and challenges unique to the language.
- Value Class: Supports scalar values and automatic differentiation with backpropagation.
- Operators: Includes overloaded operators for addition, subtraction, multiplication, division, and power.
- Activation Functions: Implements `tanh` and `exp` functions.
- Backward Propagation: Computes gradients for scalars through a computational graph (illustrated in the sketch below).
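For intuition, here is a compact, self-contained sketch of how a `Value` type can record a computational graph and backpropagate through it, showing only `+` and `*`. This is an illustrative reimplementation of the idea, not the repo's actual `value.h`; the real class also implements the other operators, `tanh`, and `exp`.

```cpp
#include <functional>
#include <iostream>
#include <memory>
#include <unordered_set>
#include <vector>

// Each node stores its value, accumulated gradient, graph parents, and a
// closure that applies the local chain rule during the backward pass.
struct Node {
    double data = 0.0;
    double grad = 0.0;
    std::vector<std::shared_ptr<Node>> parents;
    std::function<void()> backwardFn = [] {};
};

class Value {
public:
    explicit Value(double data) : node_(std::make_shared<Node>()) {
        node_->data = data;
    }

    Value operator+(const Value& rhs) const {
        Value out(node_->data + rhs.node_->data);
        out.node_->parents = {node_, rhs.node_};
        auto a = node_, b = rhs.node_, o = out.node_;
        // d(a+b)/da = 1, d(a+b)/db = 1
        o->backwardFn = [a, b, o] { a->grad += o->grad; b->grad += o->grad; };
        return out;
    }

    Value operator*(const Value& rhs) const {
        Value out(node_->data * rhs.node_->data);
        out.node_->parents = {node_, rhs.node_};
        auto a = node_, b = rhs.node_, o = out.node_;
        // d(a*b)/da = b, d(a*b)/db = a
        o->backwardFn = [a, b, o] {
            a->grad += b->data * o->grad;
            b->grad += a->data * o->grad;
        };
        return out;
    }

    void backward() {
        // Topologically sort the graph, then apply the chain rule in reverse.
        std::vector<Node*> order;
        std::unordered_set<Node*> visited;
        std::function<void(Node*)> build = [&](Node* n) {
            if (!visited.insert(n).second) return;
            for (auto& p : n->parents) build(p.get());
            order.push_back(n);
        };
        build(node_.get());
        node_->grad = 1.0;  // seed the output gradient
        for (auto it = order.rbegin(); it != order.rend(); ++it) {
            (*it)->backwardFn();
        }
    }

    double getGrad() const { return node_->grad; }

private:
    std::shared_ptr<Node> node_;
};

int main() {
    Value a(2.0), b(3.0);
    Value d = (a + b) * a;
    d.backward();
    std::cout << a.getGrad() << " " << b.getGrad() << std::endl;  // prints: 7 2
    return 0;
}
```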
- A C++ compiler supporting C++11 or later (e.g., GCC, Clang).
- CMake (optional, for building the project).
- Clone the repository:
```sh
git clone https://github.com/Siddharthm10/micrograd-cpp.git
cd micrograd-cpp
```
- Make the build script executable and run it:
```sh
chmod +x micrograd_build.sh
./micrograd_build.sh
```
- Alternatively, compile and run directly with g++ (both source files under src/ are needed):
```sh
g++ -std=c++11 -o micrograd src/main.cpp src/value.cpp
./micrograd
```
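If you prefer CMake (the repository includes a CMakeLists.txt), the standard out-of-source flow should work, assuming the file defines an executable target for the project:

```sh
mkdir -p build && cd build
cmake ..
cmake --build .
```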
Here is an example demonstrating the basic functionality of the library:
```cpp
#include <iostream>
#include "value.h"

int main() {
    Value a(2.0);
    Value b(3.0);
    Value c = a + b;   // c = 5
    Value d = c * a;   // d = 10

    d.backward();      // backpropagate gradients from d through the graph

    std::cout << "d: " << d << std::endl;
    std::cout << "c: " << c << std::endl;
    std::cout << "a.grad: " << a.getGrad() << std::endl;
    std::cout << "b.grad: " << b.getGrad() << std::endl;
    return 0;
}
```
Output:
```
d: Value(data=10, grad=1)
c: Value(data=5, grad=2)
a.grad: 7
b.grad: 2
```
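These numbers follow directly from the chain rule. Since d = c · a = (a + b) · a, the backward pass seeds d.grad = 1, gives c.grad = ∂d/∂c = a = 2, and accumulates a.grad = ∂d/∂a = c + a = 7 (a contributes through both c and the product) and b.grad = ∂d/∂b = a = 2.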
```
.
├── src
│   ├── main.cpp    # Entry point for the program
│   ├── value.cpp   # Implementation of the Value class
│   └── value.h     # Header file for the Value class
├── CMakeLists.txt  # Build configuration for CMake
└── README.md       # Project documentation
```
- Scalar Operations: Extend the set of supported scalar operations to broaden usability and refine the computational graph.
- Neurons and Layers:
  - Implement a `Neuron` class to represent individual computational units.
  - Create a `Layer` class to manage groups of neurons.
- MLP: Build an `MLP` (Multi-Layer Perceptron) class to connect multiple layers and support forward and backward propagation for neural networks (see the interface sketch after this list).
- Optimization: Explore performance improvements for gradient computation.
- Documentation: Add detailed documentation for all functions and classes.
- Unit Tests: Implement test cases for core features to ensure correctness.
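To give a rough sense of where the roadmap points, the planned classes might take a shape like the following. These are hypothetical declarations, mirroring the structure of Karpathy's original micrograd; none of this code exists in the repo yet, and all names and signatures are assumptions.

```cpp
#include <cstddef>
#include <vector>
#include "value.h"  // assumed to provide the Value class shown above

// Hypothetical interfaces for the planned classes.
class Neuron {
public:
    explicit Neuron(std::size_t numInputs);  // e.g. random weights, zero bias
    // Weighted sum of inputs plus bias, squashed through tanh.
    Value operator()(const std::vector<Value>& inputs) const;
private:
    std::vector<Value> weights_;
    Value bias_{0.0};
};

class Layer {
public:
    Layer(std::size_t numInputs, std::size_t numOutputs);
    // Apply every neuron in the layer to the same inputs.
    std::vector<Value> operator()(const std::vector<Value>& inputs) const;
private:
    std::vector<Neuron> neurons_;
};

class MLP {
public:
    // e.g. MLP({3, 4, 4, 1}) builds a 3-input, 1-output network.
    explicit MLP(const std::vector<std::size_t>& layerSizes);
    // Feed the inputs through each layer in turn.
    std::vector<Value> operator()(std::vector<Value> inputs) const;
private:
    std::vector<Layer> layers_;
};
```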
Contributions are welcome! Feel free to open issues or submit pull requests for bug fixes, enhancements, or new features.
This project is open-source and available under the MIT License.
- Andrej Karpathy for the original Micrograd.
- The C++ community for best practices and inspiration.
For any questions or discussions, feel free to reach out to Siddharth Mehta.