Modify qml.grad so that it stores and makes accessible the value of the intermediate forward pass #914

Merged: 10 commits, Nov 20, 2020
17 changes: 17 additions & 0 deletions .github/CHANGELOG.md
@@ -274,6 +274,23 @@
restart the kernel/runtime.
[(#907)](https://github.com/PennyLaneAI/pennylane/pull/907)

* When using `grad_fn = qml.grad(cost)` to compute the gradient of a cost function with the Autograd
interface, the value of the intermediate forward pass is now available via the `grad_fn.forward`
property:
[(#914)](https://github.com/PennyLaneAI/pennylane/pull/914)

```python
  def cost_fn(x, y):
      return 2*np.sin(x[0])*np.exp(-x[1]) + x[0]**3 + np.cos(y)

params = np.array([0.1, 0.5], requires_grad=True)
data = np.array(0.65, requires_grad=False)
grad_fn = qml.grad(cost_fn)

grad_fn(params, data) # perform backprop and evaluate the gradient
grad_fn.forward # the cost function value
```

<h3>Breaking changes</h3>

- The ``VQECost`` class has been renamed to ``ExpvalCost`` to reflect its general applicability
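As an aside, a minimal sketch of how the new `forward` property could be combined with an actual QNode; the device and circuit below are illustrative only and are not part of this PR:

```python
import pennylane as qml
from pennylane import numpy as np

# hypothetical two-qubit device, for illustration only
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(weights):
    qml.RX(weights[0], wires=0)
    qml.RY(weights[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

weights = np.array([0.1, 0.2], requires_grad=True)
grad_fn = qml.grad(circuit)

gradient = grad_fn(weights)  # performs both the forward and backward pass
value = grad_fn.forward      # cost value from that same forward pass
```

Since the forward pass must be executed anyway to build the backward pass, exposing it this way lets an optimizer report the cost without evaluating the circuit a second time.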
2 changes: 1 addition & 1 deletion doc/code/qml.rst
@@ -6,4 +6,4 @@ qml
.. automodapi:: pennylane
:no-heading:
:include-all-objects:
-   :skip: iter_entry_points, Version, Spec, plugin_devices, plugin_converters, default_config, reload
+   :skip: iter_entry_points, Version, Spec, plugin_devices, plugin_converters, default_config, reload, make_vjp, unary_to_nary, vspace
91 changes: 72 additions & 19 deletions pennylane/__init__.py
@@ -19,7 +19,9 @@
import pkg_resources

import numpy as _np
-from autograd import grad as _grad
+from autograd.wrap_util import unary_to_nary
+from autograd.core import make_vjp as _make_vjp, make_jvp as _make_jvp
+from autograd.extend import vspace
from autograd import jacobian as _jacobian

from semantic_version import Version, Spec
@@ -227,6 +229,24 @@ def circuit():
raise DeviceError("Device does not exist. Make sure the required plugin is installed.")


-def grad(func, argnum=None):
+make_vjp = unary_to_nary(_make_vjp)
+
+
+class grad:
"""Returns the gradient as a callable function of (functions of) QNodes.

Function arguments with the property ``requires_grad`` set to ``False``
will automatically be excluded from the gradient computation, unless
the ``argnum`` keyword argument is passed.

+    When the output gradient function is executed, both the forward pass
+    *and* the backward pass will be performed in order to
+    compute the gradient. The value of the forward pass is available via the
+    :attr:`~.forward` property.

Args:
func (function): a plain QNode, or a Python function that contains
a combination of quantum and classical nodes

Keyword Args:
argnum (int, list(int), None): Which argument(s) to take the gradient
with respect to. By default, the arguments themselves are used
to determine differentiability, by examining the ``requires_grad``
@@ -250,31 +258,76 @@ def grad(func, argnum=None):
function with respect to the differentiable arguments, or, if specified,
the arguments in ``argnum``.
"""
    """
-    # pylint: disable=no-value-for-parameter
-    if argnum is not None:
-        # for backwards compatibility with existing code
-        # that manually specifies argnum
-        return _grad(func, argnum)
-
-    def _gradient_function(*args, **kwargs):
-        """Inspect the arguments for differentiability, and
-        compute the autograd gradient function with required argnums
-        dynamically.
-
-        This wrapper function is returned to the user instead of autograd.grad,
-        so that we can take into account cases where the user computes the
-        gradient function once, but then calls it with arguments that change
-        in differentiability.
+    def __init__(self, fun, argnum=None):
+        self._forward = None
+        self._grad_fn = None
+
+        self._fun = fun
+        self._argnum = argnum
+
+        if self._argnum is not None:
+            # If the differentiable argnum is provided, we can construct
+            # the gradient function at once during initialization
+            self._grad_fn = self._grad_with_forward(fun, argnum=argnum)

+    def _get_grad_fn(self, args):
+        """Get the required gradient function.
+
+        * If the differentiable argnum was provided on initialization,
+          this has been pre-computed and is available via self._grad_fn.
+
+        * Otherwise, we must dynamically construct the gradient function
+          by inspecting which of the arguments are marked as differentiable.
+        """
+        if self._grad_fn is not None:
+            return self._grad_fn
+
+        # Inspect the arguments for differentiability, and
+        # compute the autograd gradient function with required argnums
+        # dynamically.
argnum = []

for idx, arg in enumerate(args):
if getattr(arg, "requires_grad", True):
argnum.append(idx)

-        return _grad(func, argnum)(*args, **kwargs)
+        return self._grad_with_forward(
+            self._fun,
+            argnum=argnum,
+        )

+    def __call__(self, *args, **kwargs):
+        """Evaluates the gradient function, and saves the function value
+        calculated during the forward pass in :attr:`.forward`."""
+        grad_value, ans = self._get_grad_fn(args)(*args, **kwargs)
+        self._forward = ans
+        return grad_value
+
+    @property
+    def forward(self):
+        """float: The result of the forward pass calculated while performing
+        backpropagation. Will return ``None`` if the backpropagation has not yet
+        been performed."""
+        return self._forward
+
+    @staticmethod
+    @unary_to_nary
+    def _grad_with_forward(fun, x):
+        """This function is a replica of ``autograd.grad``, with the only
+        difference being that it returns both the gradient *and* the forward pass
+        value."""
+        vjp, ans = _make_vjp(fun, x)
+
+        if not vspace(ans).size == 1:
+            raise TypeError(
+                "Grad only applies to real scalar-output functions. "
+                "Try jacobian, elementwise_grad or holomorphic_grad."
+            )

-    return _gradient_function
+        grad_value = vjp(vspace(ans).ones())
+        return grad_value, ans
Contributor (on lines +329 to +330): Nice! So these are basically the main lines that are changed...


def jacobian(func, argnum=None):
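The key mechanism in `_grad_with_forward` above is that autograd's `make_vjp` already returns the forward-pass value alongside the VJP closure, so storing it costs nothing extra. Below is a standalone sketch of that pattern in plain autograd (assuming autograd's internal API as imported above, circa v1.3; the function and variable names here are invented for illustration):

```python
import autograd.numpy as anp
from autograd.core import make_vjp as _make_vjp
from autograd.extend import vspace
from autograd.wrap_util import unary_to_nary

@unary_to_nary
def grad_with_forward(fun, x):
    # make_vjp traces the forward pass once, returning both the
    # VJP closure and the forward-pass value ``ans``
    vjp, ans = _make_vjp(fun, x)
    # seeding the VJP with ones() of a size-1 vspace yields the gradient
    return vjp(vspace(ans).ones()), ans

def cost(x):
    return anp.sin(x[0]) * anp.exp(-x[1])

g, value = grad_with_forward(cost)(anp.array([0.3, 0.7]))
print(g)      # gradient of cost with respect to x
print(value)  # forward-pass value from the same trace
```

The `grad` class is then essentially a stateful wrapper around this pattern: `__call__` unpacks the tuple, stashes `ans` on the instance, and returns only the gradient.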
80 changes: 79 additions & 1 deletion tests/test_classical_gradients.py
@@ -14,7 +14,7 @@
"""
Sanity checks for classical automatic gradient formulas (without QNodes).
"""
-
+import autograd
import pytest

import pennylane as qml
@@ -231,6 +231,84 @@ def test_linear(self, tol):
assert np.allclose(auto_grad, correct_grad, atol=tol, rtol=0)


class TestGrad:
"""Unit tests for the gradient function"""

def test_non_scalar_cost_gradient(self):
"""Test gradient computation with a non-scalar cost function raises an error"""

def cost(x):
return np.sin(x)

grad_fn = qml.grad(cost, argnum=[0])
arr1 = np.array([0.0, 1.0, 2.0], requires_grad=True)

with pytest.raises(TypeError, match="only applies to real scalar-output functions"):
grad_fn(arr1)

def test_agrees_with_autograd(self, tol):
"""Test that the grad function agrees with autograd"""

def cost(x):
return np.sum(np.sin(x) * x[0] ** 3)

grad_fn = qml.grad(cost)
params = np.array([0.5, 1.0, 2.0], requires_grad=True)
res = grad_fn(params)
expected = autograd.grad(cost)(params)

assert np.allclose(res, expected, atol=tol, rtol=0)
Contributor: Nice with rtol=0. 🔥


def test_forward_pass_value_storing(self, tol):
"""Test that the intermediate forward pass value is accessible and correct"""

def cost(x):
return np.sum(np.sin(x) * x[0] ** 3)

grad_fn = qml.grad(cost)
params = np.array([-0.654, 1.0, 2.0], requires_grad=True)

assert grad_fn.forward is None

grad = grad_fn(params)

res = grad_fn.forward
expected = cost(params)
assert np.allclose(res, expected, atol=tol, rtol=0)

# change the parameters
params2 = np.array([1.4, 1.0, 2.0], requires_grad=True)
grad = grad_fn(params2)

res = grad_fn.forward
expected = cost(params2)
assert np.allclose(res, expected, atol=tol, rtol=0)
Contributor (on lines +279 to +285): 👍


def test_no_argnum_grad(self, mocker, tol):
"""Test the qml.grad function for inferred argnums"""
cost_fn = lambda x, y: np.sin(x) * np.cos(y) + x * y ** 2

x = np.array(0.5, requires_grad=True)
y = np.array(0.2, requires_grad=True)

grad_fn = qml.grad(cost_fn)
spy = mocker.spy(grad_fn, "_grad_with_forward")

res = grad_fn(x, y)
expected = np.array([np.cos(x) * np.cos(y) + y ** 2, -np.sin(x) * np.sin(y) + 2 * x * y])
assert np.allclose(res, expected, atol=tol, rtol=0)
assert spy.call_args_list[0][1]["argnum"] == [0, 1]

x = np.array(0.5, requires_grad=True)
y = np.array(0.2, requires_grad=False)
spy.call_args_list = []

res = grad_fn(x, y)
expected = np.array([np.cos(x) * np.cos(y) + y ** 2])
assert np.allclose(res, expected, atol=tol, rtol=0)
assert spy.call_args_list[0][1]["argnum"] == [0]


Contributor: I couldn't see anywhere in the tests a check for qml.grad where we have multiple variables, some with requires_grad=True and some with requires_grad=False; might be worth adding? (I could see this for qml.jacobian() though.)

Member Author: Have added another test!

class TestJacobian:
"""Tests for the jacobian function"""

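The extra test mentioned in the thread above is not visible in this diff view. Purely as an illustration, a hypothetical sketch of such a check (test name and values invented here), following the patterns already used in `TestGrad`:

```python
    def test_mixed_requires_grad(self, tol):
        """Hypothetical sketch: gradient with a mix of requires_grad=True
        and requires_grad=False arguments"""
        cost_fn = lambda x, y: np.sin(x) * np.cos(y) + x * y

        x = np.array(0.4, requires_grad=True)
        y = np.array(0.9, requires_grad=False)

        grad_fn = qml.grad(cost_fn)
        res = grad_fn(x, y)

        # only the derivative with respect to x should be returned
        expected = np.cos(x) * np.cos(y) + y
        assert np.allclose(res, expected, atol=tol, rtol=0)
```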