Micrograd Implementation in C++

Overview

This project is a C++ implementation of Andrej Karpathy's Micrograd, a lightweight library for automatic differentiation. The library is built from scratch and demonstrates the foundational concepts of computational graphs and backpropagation. Reimplementing the project in C++ surfaces performance optimizations, advanced programming techniques, and challenges unique to the language.

Features

  • Value Class: Supports scalar values and automatic differentiation with backpropagation.
  • Operators: Includes overloaded operators for addition, subtraction, multiplication, division, and power.
  • Activation Functions: Implements tanh and exp functions.
  • Backward Propagation: Computes gradients for scalars through a computational graph.

Getting Started

Prerequisites

  • A C++ compiler supporting C++11 or later (e.g., GCC, Clang).
  • CMake (optional, for building the project).

Cloning the Repository

git clone https://github.com/Siddharthm10/micrograd-cpp.git
cd micrograd-cpp

Building the Project

Using the build script (wraps CMake):

  1. Make micrograd_build.sh executable:
chmod +x micrograd_build.sh
  2. Run the script:
./micrograd_build.sh

Using G++:

g++ -std=c++11 -o micrograd src/main.cpp src/value.cpp
./micrograd

Example Usage

Here is an example demonstrating the basic functionality of the library:

#include <iostream>
#include "value.h"

int main() {
    Value a(2.0);
    Value b(3.0);
    Value c = a + b;
    Value d = c * a;

    d.backward();

    std::cout << "d: " << d << std::endl;
    std::cout << "c: " << c << std::endl;
    std::cout << "a.grad: " << a.getGrad() << std::endl;
    std::cout << "b.grad: " << b.getGrad() << std::endl;

    return 0;
}

Output:

d: Value(data=10, grad=1)
c: Value(data=5, grad=2)
a.grad: 7
b.grad: 2

Directory Structure

.
├── src
│   ├── main.cpp       # Entry point for the program
│   ├── value.cpp      # Implementation of the Value class
│   └── value.h        # Header file for the Value class
├── CMakeLists.txt     # Build configuration for CMake
└── README.md          # Project documentation

TODO

  • Scalar Interoperability: Support mixed operations between Value objects and raw scalars (e.g., a + 2.0) to broaden usability and refine the computational graph.
  • Neurons and Layers:
    • Implement a Neuron class to represent individual computational units.
    • Create a Layer class to manage groups of neurons.
  • MLP: Build an MLP (Multi-Layer Perceptron) class to connect multiple layers and support forward and backward propagation for neural networks.
  • Optimization: Explore performance improvements for gradient computation.
  • Documentation: Add detailed documentation for all functions and classes.
  • Unit Tests: Implement test cases for core features to ensure correctness.

Contributing

Contributions are welcome! Feel free to open issues or submit pull requests for bug fixes, enhancements, or new features.

License

This project is open-source and available under the MIT License.

Acknowledgements

  • Andrej Karpathy for the original Micrograd.
  • The C++ community for best practices and inspiration.

Contact

For any questions or discussions, feel free to reach out to Siddharth Mehta.
