# df

Code for understanding automatic differentiation.

Automatic differentiation is being built into programming languages such as Swift and Julia, and added as libraries to languages such as Python. This is changing how many optimization problems are expressed and increasing the sophistication of deep learning models.

This repo is an attempt to explain and understand automatic differentiation at the conceptual level. I may elaborate on it later with algorithms for efficient implementation, but, as of this writing, the main intention is illustration, understanding, and exploration.

## Blog posts on the topic

  1. AD - differentiable functions: treating functions of a single variable as numbers.
  2. AD - Dual numbers and Taylor numbers.
  3. AD - Higher ranked beings: generalized vectors and tensors.

## Source code index

  1. dmath.hs - Differentiable functions: functions of a single variable treated as numbers (first sketch below).
  2. dual.hs - Dual numbers (second sketch below).
  3. taylor.hs - Taylor numbers. Not exactly a standard term, but with this definition all derivatives of a function at a point can be calculated, which means we can produce arbitrary Taylor series approximations; hence I'm calling these "Taylor numbers" (third sketch below).
  4. taylorz.hs - Same as taylor.hs above, but with special support for zero to prune the expression tree.
  5. dvmath.hs - Applies the approach of dmath.hs to functions of vectors, tensors and other higher ranked objects. Compared with the usual number protocol, this introduces additional operations such as inner/outer products, convolutions, and slices.
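
To make the "functions as numbers" idea concrete, here is a minimal sketch of the dmath.hs approach as I understand it; the type `D` and the names `eval`, `deriv`, and `x` are illustrative, not necessarily what the file itself uses. A function is paired with its derivative function, and a `Num` instance propagates derivatives through arithmetic via the sum and product rules:

```haskell
-- A function of one variable paired with its derivative function.
data D = D { eval :: Double -> Double, deriv :: Double -> Double }

-- The identity function x, whose derivative is the constant 1.
x :: D
x = D id (const 1)

instance Num D where
  D f f' + D g g' = D (\t -> f t + g t) (\t -> f' t + g' t)
  -- product rule: (f*g)' = f'*g + f*g'
  D f f' * D g g' = D (\t -> f t * g t) (\t -> f' t * g t + f t * g' t)
  negate (D f f') = D (negate . f) (negate . f')
  fromInteger n   = D (const (fromInteger n)) (const 0)  -- constants
  abs    = error "abs: not differentiable at 0"
  signum = error "signum: not differentiable at 0"
```

With this, `let p = x*x + 3*x + 2 in (eval p 2, deriv p 2)` yields `(12.0, 7.0)`: the polynomial and its derivative 2x + 3, both evaluated at 2.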
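Dual numbers extend the reals with an infinitesimal ε satisfying ε² = 0, so evaluating f(x + ε) yields f(x) + f'(x)ε and the ε-coefficient is the derivative. A minimal sketch of the idea (again with illustrative names, not necessarily those of dual.hs):

```haskell
-- A dual number a + b*eps, where eps^2 = 0.
data Dual = Dual Double Double deriving Show

instance Num Dual where
  Dual a b + Dual c d = Dual (a + c) (b + d)
  -- (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps^2 = 0
  Dual a b * Dual c d = Dual (a * c) (a * d + b * c)
  negate (Dual a b)   = Dual (negate a) (negate b)
  fromInteger n       = Dual (fromInteger n) 0
  abs    = error "abs: not differentiable at 0"
  signum = error "signum: not differentiable at 0"

-- Differentiate f at x by evaluating it at x + eps and
-- reading off the eps-coefficient.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = let Dual _ d = f (Dual x 1) in d
```

For example, `diff (\t -> t*t*t + 2*t) 2` evaluates to `14.0`, i.e. 3x² + 2 at x = 2.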
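The "Taylor number" idea can be sketched as a lazy tower: a value together with the tower of all its derivatives at the same point, with the product rule applied recursively. This is a sketch of the concept rather than the repo's code, and it also makes the motivation for taylorz.hs visible, since constants drag along infinite tails of zeros that a dedicated zero case could prune:

```haskell
-- A "Taylor number": a value and the tower of all its derivatives
-- at the same point. Lazy, hence effectively infinite.
data Taylor = T Double Taylor

-- The all-zero tower: 0, 0, 0, ...
zeroT :: Taylor
zeroT = T 0 zeroT

-- The variable at the point a: value a, first derivative 1, rest 0.
var :: Double -> Taylor
var a = T a (T 1 zeroT)

instance Num Taylor where
  T f f' + T g g'         = T (f + g) (f' + g')
  -- product rule, applied recursively: (p*q)' = p'*q + p*q'
  p@(T f f') * q@(T g g') = T (f * g) (f' * q + p * g')
  negate (T f f')         = T (negate f) (negate f')
  fromInteger n           = T (fromInteger n) zeroT
  abs    = error "abs: not differentiable at 0"
  signum = error "signum: not differentiable at 0"

-- The infinite list f(a), f'(a), f''(a), ...
derivs :: Taylor -> [Double]
derivs (T v d) = v : derivs d
```

Here `take 5 (derivs (let t = var 2 in t*t*t))` gives `[8.0,12.0,12.0,6.0,0.0]`, the successive derivatives of x³ at 2; dividing the n-th entry by n! gives the coefficients of the Taylor series. Note how every constant carries an infinite tail of zeros, which is exactly the expression-tree growth that taylorz.hs's special zero support is meant to prune.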