Autograd

Autograd is an automatic differentiation package for Python, using native Python and Numpy syntax. It can handle a large subset of Python's features, including loops, ifs, recursion and even closures. It uses reverse-mode differentiation (a.k.a. backpropagation), meaning it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments. The main intended application is gradient-based optimization.

Example use:

import autograd.numpy as np
import matplotlib.pyplot as plt
from autograd import grad

def fun(x):
    return np.sin(x)

d_fun = grad(fun)    # First derivative
dd_fun = grad(d_fun) # Second derivative

x = np.linspace(-10, 10, 100)
plt.plot(x, list(map(fun, x)), x, list(map(d_fun, x)), x, list(map(dd_fun, x)))
plt.show()
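The example above differentiates with respect to a scalar input, but grad works the same way when the argument is an array, which is the typical setup for gradient-based optimization mentioned above. Below is a minimal sketch of that use case; the quadratic loss, the data, and the step size are made up for illustration and are not part of the original examples.

import autograd.numpy as np
from autograd import grad

# Made-up data for illustration only.
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = np.array([1.0, 2.0, 3.0])

# A scalar-valued loss of an array-valued parameter vector.
def loss(w):
    return np.sum((np.dot(X, w) - y) ** 2)

grad_loss = grad(loss)   # Gradient with respect to the array argument w

w = np.zeros(2)
for _ in range(200):     # Plain gradient descent with a small, safe step size
    w = w - 0.001 * grad_loss(w)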

The function can even have control flow, which raises the prospect of differentiating through an iterative routine like an optimization. Here's a simple example.

# Taylor approximation to sin function
def fun(x):
    currterm = x
    ans = currterm
    for i in range(1000):
        currterm = - currterm * x ** 2 / ((2 * i + 3) * (2 * i + 2))
        ans = ans + currterm
        if np.abs(currterm) < 0.2: break # (Very generous tolerance!)

    return ans

d_fun = grad(fun)
dd_fun = grad(d_fun)

x = np.linspace(-10, 10, 100)
plt.plot(x, list(map(fun, x)), x, list(map(d_fun, x)), x, list(map(dd_fun, x)))
plt.show()

We can take the derivative of the derivative automatically as well, as many times as we like:

# Define the tanh function
def tanh(x):
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

d_fun = grad(tanh)           # First derivative
dd_fun = grad(d_fun)         # Second derivative
ddd_fun = grad(dd_fun)       # Third derivative
dddd_fun = grad(ddd_fun)     # Fourth derivative
ddddd_fun = grad(dddd_fun)   # Fifth derivative
dddddd_fun = grad(ddddd_fun) # Sixth derivative

x = np.linspace(-7, 7, 200)
plt.plot(x, list(map(tanh, x)),
         x, list(map(d_fun, x)),
         x, list(map(dd_fun, x)),
         x, list(map(ddd_fun, x)),
         x, list(map(dddd_fun, x)),
         x, list(map(ddddd_fun, x)),
         x, list(map(dddddd_fun, x)))
plt.show()
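Since grad simply returns another Python function, repeated differentiation can also be written as a loop. A small sketch, not from the original README, using the tanh function and the grad import defined above:

from autograd import grad

def nth_derivative(f, n):
    # Compose grad n times to obtain the n-th derivative of a scalar function.
    for _ in range(n):
        f = grad(f)
    return f

d6_tanh = nth_derivative(tanh, 6)
print(d6_tanh(1.0))   # Sixth derivative of tanh evaluated at x = 1.0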

Examples:

More complete examples can be found in the examples/ directory of the repository.
How to install:

Simply run

git clone --depth 1 --branch master https://github.com/HIPS/autograd.git
cd autograd/
python setup.py install
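
To verify the installation, a quick sanity check (a minimal sketch; it relies only on the fact that the derivative of sin at 0 is cos(0) = 1):

import autograd.numpy as np
from autograd import grad

# d/dx sin(x) evaluated at x = 0 should be 1.0
print(grad(np.sin)(0.0))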

Authors:

Dougal Maclaurin and David Duvenaud

We thank Matthew Johnson, Jasper Snoek, and the rest of the HIPS group (led by Ryan P. Adams) for helpful contributions. We thank Analog Devices International and Samsung Advanced Institute of Technology for their support.
