
Beyond Backpropagation: Automatic Differentiation

March 15, 2017

Supervised machine learning can be formulated as the optimization of model parameters to minimize the prediction loss on a training set. This optimization is typically carried out by iterative gradient descent. The backpropagation algorithm is a clever method that drastically reduces the cost of computing the gradient, making the training of neural networks feasible in the first place. But backpropagation is just a special case of a much more general tool: automatic differentiation. This talk introduces the ideas behind automatic differentiation and shows why it is better suited to many machine learning applications than either numeric or symbolic differentiation.
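To give a flavor of the idea, here is a minimal sketch of forward-mode automatic differentiation using dual numbers (an illustration of the general technique, not code from the talk; the class and function names are hypothetical):

```python
class Dual:
    """A dual number a + b*eps with eps**2 == 0; b carries the derivative."""

    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: derivatives add.
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps.
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f and its exact derivative at x in a single forward pass."""
    out = f(Dual(x, 1.0))
    return out.value, out.deriv


# f(x) = 3x^2 + 2x, so f'(x) = 6x + 2.
value, slope = derivative(lambda x: 3 * x * x + 2 * x, 4.0)
# value == 56.0, slope == 26.0
```

Unlike numeric differentiation, the result is exact (no finite-difference truncation error), and unlike symbolic differentiation there is no expression blow-up: derivatives are propagated through the program's operations as it runs.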

This meetup's talk was given by Matthias Richter.