
# Autodidact: a pedagogical implementation of Autograd

This is a tutorial implementation based on the full version of Autograd.

Example use:

```
>>> import autograd.numpy as np  # Thinly-wrapped numpy
>>> from autograd import grad    # Reverse-mode gradient operator
>>>
>>> def tanh(x):                 # Define a function
...     y = np.exp(-2.0 * x)
...     return (1.0 - y) / (1.0 + y)
...
>>> grad(tanh)(1.0)              # Evaluate the gradient at x = 1.0
0.41997434161402603
>>> (tanh(1.0001) - tanh(0.9999)) / 0.0002  # Compare to finite differences
0.41997434264973155
```

We can continue to differentiate as many times as we like, and use numpy's vectorization of scalar-valued functions across many different input values:

```
>>> import matplotlib.pyplot as plt
>>> x = np.linspace(-7, 7, 200)
>>> plt.plot(x, tanh(x),
...          x, grad(tanh)(x))                                # first  derivative
>>> plt.show()
```
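The mechanism behind `grad` — recording each primitive operation together with its vector-Jacobian products, then sweeping the recorded graph backwards — can be sketched in plain Python. Everything below (the `Node` class, this `exp`, this `grad`) is a hypothetical toy for illustration, not autodidact's actual code:

```python
import math

class Node:
    """Wraps a value and remembers how it was computed (toy tracer)."""
    def __init__(self, value, parents=(), vjps=()):
        self.value = value
        self.parents = parents  # input Nodes of the operation
        self.vjps = vjps        # one vector-Jacobian product per parent

    @staticmethod
    def wrap(x):
        return x if isinstance(x, Node) else Node(x)

    def __add__(self, other):
        other = Node.wrap(other)
        return Node(self.value + other.value, (self, other),
                    (lambda g: g, lambda g: g))
    __radd__ = __add__

    def __mul__(self, other):
        other = Node.wrap(other)
        return Node(self.value * other.value, (self, other),
                    (lambda g: g * other.value, lambda g: g * self.value))
    __rmul__ = __mul__

    def __sub__(self, other):
        other = Node.wrap(other)
        return Node(self.value - other.value, (self, other),
                    (lambda g: g, lambda g: -g))

    def __rsub__(self, other):
        return Node.wrap(other) - self

    def __truediv__(self, other):
        other = Node.wrap(other)
        return Node(self.value / other.value, (self, other),
                    (lambda g: g / other.value,
                     lambda g: -g * self.value / other.value ** 2))

def exp(x):
    x = Node.wrap(x)
    v = math.exp(x.value)
    return Node(v, (x,), (lambda g: g * v,))

def grad(f):
    """Gradient of a scalar-to-scalar function built from the ops above."""
    def gradfun(x):
        start = Node(x)
        end = f(start)
        total = [0.0]
        # Sum gradient contributions over every path from output to input.
        # (Real implementations do one topologically ordered backward pass
        # instead of re-walking shared subgraphs per path.)
        def backward(node, g):
            if node is start:
                total[0] += g
            for parent, vjp in zip(node.parents, node.vjps):
                backward(parent, vjp(g))
        backward(end, 1.0)
        return total[0]
    return gradfun

def tanh(x):
    y = exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

print(grad(tanh)(1.0))  # ≈ 0.419974, matching the example above
```

Note how `tanh` is the same source code as in the example: operator overloading lets the tracer follow ordinary numeric Python unchanged, which is the trick both Autograd and autodidact rely on.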

Autograd was written by Dougal Maclaurin, David Duvenaud and Matt Johnson. See the main page for more information.
