This repo adds automatic differentiation to pystencils.
Install via pip:
pip install pystencils-autodiff
or, if you have cloned or downloaded this repository, using:
pip install -e .
Then, you can access the submodule pystencils.autodiff:
import pystencils.autodiff
Create a pystencils.AssignmentCollection with pystencils:
import sympy
import pystencils
z, y, x = pystencils.fields("z, y, x: [20,30]")
forward_assignments = pystencils.AssignmentCollection({
    z[0, 0]: x[0, 0] * sympy.log(x[0, 0] * y[0, 0])
})
print(forward_assignments)
Subexpressions:
Main Assignments:
    z[0,0] ← x_C*log(x_C*y_C)
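Since forward_assignments is an ordinary pystencils AssignmentCollection, you can also compile and run it as a plain kernel, independently of autodiff. A minimal sketch using the standard pystencils API (array names follow the field names declared above):

import numpy as np

# Compile the forward assignments into a callable CPU kernel (plain pystencils, no autodiff)
kernel = pystencils.create_kernel(forward_assignments).compile()

# Arrays match the declared field shape [20,30]; values > 0 keep log() well-defined
x_arr = np.random.rand(20, 30) + 1.0
y_arr = np.random.rand(20, 30) + 1.0
z_arr = np.zeros((20, 30))

kernel(x=x_arr, y=y_arr, z=z_arr)  # arrays are passed by keyword, named after the fields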
You can then obtain the corresponding backward assignments:
from pystencils.autodiff import AutoDiffOp, create_backward_assignments
backward_assignments = create_backward_assignments(forward_assignments)
print(backward_assignments)
You can see the derivatives with respect to the two inputs, multiplied by the gradient diffz_C of the output z_C.
Subexpressions:
Main Assignments:
    \hat{x}[0,0] ← diffz_C*(log(x_C*y_C) + 1)
    \hat{y}[0,0] ← diffz_C*x_C/y_C
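These are exactly what the chain rule gives for z = x*log(x*y): dz/dx = log(x*y) + 1 and dz/dy = x/y, each scaled by the upstream gradient diffz_C. A quick sanity check with plain sympy:

x_C, y_C = sympy.symbols('x_C y_C', positive=True)
z_C = x_C * sympy.log(x_C * y_C)

print(sympy.simplify(sympy.diff(z_C, x_C)))  # log(x_C*y_C) + 1
print(sympy.simplify(sympy.diff(z_C, y_C)))  # x_C/y_C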
You can also use the class AutoDiffOp to obtain both the assignments (if you are curious) and auto-differentiable operations for TensorFlow...
op = AutoDiffOp(forward_assignments)
backward_assignments = op.backward_assignments
tensorflow_op = op.create_tensorflow_op(backend='tensorflow_native', use_cuda=True)
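The exact calling convention of the returned op is not shown here; the sketch below assumes the field tensors are passed by name and that the op is differentiable under tf.GradientTape (both are assumptions, not documented API):

import tensorflow as tf

x_t = tf.random.uniform((20, 30), dtype=tf.float64) + 1.0  # values > 0 keep log() well-defined
y_t = tf.random.uniform((20, 30), dtype=tf.float64) + 1.0

with tf.GradientTape() as tape:
    tape.watch([x_t, y_t])
    z_t = tensorflow_op(x=x_t, y=y_t)  # assumed calling convention

dz_dx, dz_dy = tape.gradient(z_t, [x_t, y_t])  # non-scalar targets are summed implicitly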
... or PyTorch (the same create_tensorflow_op method is reused; the backend argument selects the native Torch backend):
torch_op = op.create_tensorflow_op(backend='torch_native', use_cuda=True)
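Again, the calling convention of the returned op is an assumption here; once the op is in hand, gradients should flow through the usual autograd machinery:

import torch

# Leaf tensors with values > 0; with use_cuda=True they presumably need device='cuda'
x_t = (torch.rand(20, 30, dtype=torch.float64) + 1.0).requires_grad_()
y_t = (torch.rand(20, 30, dtype=torch.float64) + 1.0).requires_grad_()

z_t = torch_op(x_t, y_t)  # assumed positional calling convention
z_t.sum().backward()

print(x_t.grad)  # should equal log(x_t*y_t) + 1
print(y_t.grad)  # should equal x_t/y_t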