
A fast auto differentiation engine implemented in C++ 🔥


rod-rom/autograd.cpp


autograd.cpp

A scalar-valued automatic differentiation engine in C++. (WIP)

What is automatic differentiation?

Automatic differentiation is a computational technique for efficiently computing the derivatives of functions. It is vital to gradient-based optimization methods such as stochastic gradient descent. There are two modes of automatic differentiation: forward mode and reverse mode. In reverse mode, a forward pass first evaluates the intermediate variables and stores the expression tree (also called a computational graph) in memory; a backward pass then computes the partial derivatives of the output w.r.t. each intermediate variable by applying the chain rule.


Motivation

The reason I wanted to build this project was to learn the C++ language. Most high-performance machine learning packages, such as PyTorch and NumPy, are written in C/C++. So I took it upon myself to create this project in order to understand the technology that drives these libraries.

🤝 Contributing

Clone the repo

```shell
git clone https://github.com/<username>/autograd.cpp.git
cd autograd.cpp
```

Submit a pull request

If you'd like to contribute, please fork the repository and open a pull request.
