chanshing/reverse_ad

A minimal example of reverse-mode automatic differentiation (aka backpropagation)

A minimal example of reverse-mode automatic differentiation imitating the PyTorch interface. We create a few Variable objects, perform simple operations on them, and then call .backward() to compute the gradient components, which are accessible via .grad:

x1 = Variable(1.0)
x2 = Variable(2.0)

a1 = x1*x2
a2 = x1/x2

b1 = a1/a2
b2 = a1*a2

y = b1-b2

y.backward()

print(x1.grad, x2.grad)  # -2.0 4.0
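
For reference, here is a minimal sketch of how such a Variable class could be implemented. This is not necessarily the repository's actual code: the _parents field, the recorded (parent, local_derivative) pairs, and the naive recursive backward pass are assumptions chosen to keep the example short.

class Variable:
    def __init__(self, value, parents=()):
        # 'parents' holds (parent_variable, local_derivative) pairs
        # recorded by the operation that produced this node.
        self.value = value
        self.grad = 0.0
        self._parents = parents

    def __mul__(self, other):
        # d(x*y)/dx = y,  d(x*y)/dy = x
        return Variable(self.value * other.value,
                        ((self, other.value), (other, self.value)))

    def __truediv__(self, other):
        # d(x/y)/dx = 1/y,  d(x/y)/dy = -x/y**2
        return Variable(self.value / other.value,
                        ((self, 1.0 / other.value),
                         (other, -self.value / other.value ** 2)))

    def __sub__(self, other):
        # d(x-y)/dx = 1,  d(x-y)/dy = -1
        return Variable(self.value - other.value,
                        ((self, 1.0), (other, -1.0)))

    def backward(self, grad=1.0):
        # Accumulate the incoming gradient, then push it to each parent
        # scaled by the local derivative (chain rule). This naive
        # recursion walks every path through the graph, which is fine
        # for a toy example but exponential in the worst case; a more
        # careful implementation would traverse the recorded graph once
        # in reverse topological order, as PyTorch does.
        self.grad += grad
        for parent, local_grad in self._parents:
            parent.backward(grad * local_grad)

Running the example above with this sketch prints -2.0 4.0, matching the analytic gradients of y = x2**2 - x1**2 at (1.0, 2.0).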
