A scalar-valued automatic differentiation engine in C++. (WIP)
Automatic differentiation is a computational technique for efficiently computing derivatives of functions, and it underpins gradient-based optimization methods such as stochastic gradient descent. There are two modes of automatic differentiation: forward mode and reverse mode. This engine uses reverse mode: a forward pass evaluates the intermediate variables and stores the expression tree (also called a computational graph) in memory, and a backward pass then traverses that graph to compute the partial derivatives of the output with respect to each intermediate variable via the chain rule.
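To make the reverse-mode mechanics concrete, here is a minimal, self-contained sketch of a scalar node that records its parents and a chain-rule closure during the forward pass, then replays those closures in reverse topological order. The `Value`, `make_value`, and `backward` names are illustrative assumptions for this sketch, not this repository's actual API.

```cpp
#include <cstdio>
#include <functional>
#include <memory>
#include <unordered_set>
#include <vector>

// Each Value holds its forward result, an accumulator for the gradient,
// pointers to the nodes it was computed from, and a closure that applies
// the local chain rule when the backward pass reaches it.
struct Value {
    double data;                                  // forward-pass result
    double grad = 0.0;                            // d(output)/d(this)
    std::function<void()> backward_fn;            // local chain-rule step
    std::vector<std::shared_ptr<Value>> parents;  // inputs to this node

    explicit Value(double d) : data(d) {}
};

using Ptr = std::shared_ptr<Value>;

Ptr make_value(double d) { return std::make_shared<Value>(d); }

Ptr operator+(const Ptr& a, const Ptr& b) {
    auto out = make_value(a->data + b->data);
    out->parents = {a, b};
    out->backward_fn = [a, b, raw = out.get()] {
        a->grad += raw->grad;  // d(a+b)/da = 1
        b->grad += raw->grad;  // d(a+b)/db = 1
    };
    return out;
}

Ptr operator*(const Ptr& a, const Ptr& b) {
    auto out = make_value(a->data * b->data);
    out->parents = {a, b};
    out->backward_fn = [a, b, raw = out.get()] {
        a->grad += b->data * raw->grad;  // d(a*b)/da = b
        b->grad += a->data * raw->grad;  // d(a*b)/db = a
    };
    return out;
}

// Backward pass: visit nodes in reverse topological order, letting each
// node push its gradient contribution to its parents.
void backward(const Ptr& root) {
    std::vector<Value*> order;
    std::unordered_set<Value*> visited;
    std::function<void(Value*)> dfs = [&](Value* v) {
        if (!visited.insert(v).second) return;
        for (auto& p : v->parents) dfs(p.get());
        order.push_back(v);
    };
    dfs(root.get());

    root->grad = 1.0;  // seed: d(root)/d(root) = 1
    for (auto it = order.rbegin(); it != order.rend(); ++it)
        if ((*it)->backward_fn) (*it)->backward_fn();
}

int main() {
    auto x = make_value(2.0), y = make_value(3.0);
    auto z = x * y + x;  // z = x*y + x, so dz/dx = y + 1, dz/dy = x
    backward(z);
    std::printf("dz/dx = %.1f, dz/dy = %.1f\n", x->grad, y->grad);  // 4.0, 2.0
}
```

Each operator overload does two jobs: it computes the forward value and registers how gradients should flow back through it; `backward` then just replays those registered steps from the output to the inputs.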
I built this project primarily to learn C++. Most high-performance machine learning packages, such as PyTorch and NumPy, have cores written in C/C++, so I set out to build an autodiff engine from scratch to understand the technology that drives these libraries.
```sh
git clone https://github.com/<username>/autograd.cpp.git
cd autograd.cpp
```
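The build step is not yet documented (the project is a work in progress); assuming a single-source layout with a hypothetical `main.cpp`, compiling might look like:

```sh
# Hypothetical build command; adjust to the repository's actual file layout.
g++ -std=c++17 -O2 main.cpp -o autograd
```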
If you'd like to contribute, please fork the repository and open a pull request.