# alexshtf / autodiff

A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.

# Project Description

A library that provides moderately fast, accurate, and automatic differentiation (computes derivative / gradient) of mathematical functions.

AutoDiff provides a simple and intuitive API for computing function gradients/derivatives along with a fast algorithm for performing the computation. Such computations are mainly useful in iterative numerical optimization scenarios.

# Code example

```csharp
using System;
using AutoDiff;

class Program
{
    public static void Main(string[] args)
    {
        // define variables
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();

        // define our function: f(x, y, z) = (x + y) * e^(z + x*y)
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // prepare arrays needed for evaluation/differentiation
        Variable[] vars = { x, y, z };
        double[] values = { 1, 2, -3 };

        // evaluate func at (1, 2, -3)
        double value = func.Evaluate(vars, values);

        // calculate the gradient at (1, 2, -3)
        double[] gradient = func.Differentiate(vars, values);

        // print results
        Console.WriteLine("The value at (1, 2, -3) is " + value);
        Console.WriteLine("The gradient at (1, 2, -3) is ({0}, {1}, {2})",
            gradient[0], gradient[1], gradient[2]);
    }
}
```
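When the same function is evaluated and differentiated many times (as in iterative optimization), re-traversing the term tree on every call is wasteful. The library exposes a `Compile` step that fixes the variable order once and returns a compiled term; the sketch below assumes the `ICompiledTerm` API, whose `Differentiate` returns both the gradient and the value from a single reverse-mode pass (check the library's documentation for the exact return type).

```csharp
using System;
using AutoDiff;

class CompiledExample
{
    public static void Main()
    {
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // compile once; the compiled form can be reused for many evaluations
        var compiled = func.Compile(x, y, z);

        // one reverse-mode sweep yields the gradient and the value together
        var diff = compiled.Differentiate(1, 2, -3);
        double[] gradient = diff.Item1;
        double value = diff.Item2;

        Console.WriteLine("value = " + value);
        Console.WriteLine("df/dx = " + gradient[0]);
    }
}
```

Compiling is a one-time cost; inside an optimization loop only `Evaluate`/`Differentiate` on the compiled term are called.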

# Documentation

The documentation contains some basic tutorials, there is an article on CodeProject, and the source code includes some code examples in addition to the code of the library itself.

# Motivation

There are many open-source and commercial .NET libraries that include numerical optimization among their features (for example, Microsoft Solver Foundation, AlgLib, Extreme Optimization, CenterSpace NMath). Most of them require the user to supply both the function and its gradient. This library saves the work of deriving the gradient by hand and coding it. Once the developer defines a function, the AutoDiff library can automatically evaluate and differentiate it at any point. This allows easy development and prototyping of applications that require numerical optimization.

# Features

• Moderate execution speed. We aim to compute a gradient in no more than 50 times the duration of evaluating the function with manually tuned code.
• Composition of functions using arithmetic operators, Exp, Log, Power, and user-defined unary and binary functions.
• Function gradient evaluation at specified points.
• Function value evaluation at specified points.
• Computes gradients using the reverse-mode AD algorithm in linear time, which is substantially faster than numerical gradient approximation for multivariate functions.
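The last point is easy to check empirically: a central-difference approximation needs 2n function evaluations for n variables, while reverse-mode AD produces the whole gradient in one sweep. The sketch below compares the two on a small function; the function and the step size `h` are arbitrary choices for illustration, not part of the library.

```csharp
using System;
using AutoDiff;

class GradientCheck
{
    public static void Main()
    {
        var x = new Variable();
        var y = new Variable();
        Variable[] vars = { x, y };

        // f(x, y) = x^2 * y + e^y (an arbitrary test function)
        var func = TermBuilder.Power(x, 2) * y + TermBuilder.Exp(y);

        double[] point = { 1.5, -0.5 };
        double[] adGrad = func.Differentiate(vars, point);

        // central-difference approximation: two extra evaluations per variable,
        // versus a single reverse-mode sweep for the whole gradient
        const double h = 1e-6;
        for (int i = 0; i < vars.Length; i++)
        {
            var plus = (double[])point.Clone(); plus[i] += h;
            var minus = (double[])point.Clone(); minus[i] -= h;
            double numeric =
                (func.Evaluate(vars, plus) - func.Evaluate(vars, minus)) / (2 * h);
            Console.WriteLine($"component {i}: AD = {adGrad[i]}, numeric = {numeric}");
        }
    }
}
```

The two columns should agree to roughly the accuracy of the finite-difference step; the AD result is exact up to floating-point rounding.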

# Using in research papers

If you like the library and it helps you publish a research paper, please cite the paper I originally wrote the library for, using geosemantic.bib.

# Used by
