Ensure that MottonenStatePreparation is fully differentiable #1361

Open
mariaschuld opened this issue May 25, 2021 · 3 comments

mariaschuld (Contributor) commented May 25, 2021

This issue has been tagged for contributions during unitaryHACK

Issue description

At the moment, the MottonenStatePreparation template, which prepares a quantum state whose amplitude vector encodes a specified input vector, is not fully differentiable. This is why the test cases for differentiability/interfaces are currently reduced to a trivial example.

  • Expected behavior:

The template should be fully differentiable with respect to state_vector in all interfaces (tf, torch, autograd/numpy, jax); a minimal sketch of the kind of gradient call that should work is given after this list.

  • Actual behavior:

At the moment, differentiation in some interfaces does not work. This is likely due to the complicated input-processing pipeline in the template, which was rewritten using PennyLane's math library to be interface-agnostic, but which does not yet cover all cases.

A detailed investigation of which cases work and which do not may be part of the solution to this issue.
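
The following is a minimal sketch (not taken from the issue) of the kind of gradient call that should work in every interface, shown here with the autograd/numpy interface; the device and wire labels are illustrative:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="autograd")
def circuit(state_vector):
    # Prepare the state whose amplitudes are given by state_vector
    qml.templates.MottonenStatePreparation(state_vector, wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

# A normalized input vector; its entries are the trainable parameters
state = np.array([0.7, 0.1, 0.5, 0.5], requires_grad=True)

print(circuit(state))
print(qml.grad(circuit)(state))  # should return a gradient, not raise an error
```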

Additional information

A minimal goal could be to copy the differentiation tests from another template (like this example class of tests) and make them pass. To construct a circuit that the template can be compared against, one would have to work out the decomposition (and how the gate angles depend on the template's input vector) for a very simple case, which may be a bit tricky; a sketch of such a test is given below.
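
A hedged sketch of what such a differentiation test might look like for the single-qubit case, using the autograd interface. The reference decomposition (a single RY rotation by 2·arctan2(b, a) for a normalized, non-negative real input [a, b]) is worked out here for illustration and is not taken from the issue:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def template_circuit(state_vector):
    qml.templates.MottonenStatePreparation(state_vector, wires=[0])
    return qml.expval(qml.PauliZ(0))

@qml.qnode(dev)
def reference_circuit(state_vector):
    # For a normalized real vector [a, b] with a > 0, the template reduces
    # to a single RY rotation by 2*arctan2(b, a) (worked-out example)
    qml.RY(2 * np.arctan2(state_vector[1], state_vector[0]), wires=0)
    return qml.expval(qml.PauliZ(0))

def test_gradient_matches_single_qubit_decomposition():
    state = np.array([0.8, 0.6], requires_grad=True)  # already normalized
    grad_template = qml.grad(template_circuit)(state)
    grad_reference = qml.grad(reference_circuit)(state)
    assert np.allclose(grad_template, grad_reference)
```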

MottonenStatePreparation is called by AmplitudeEmbedding and QubitStateVector if the device needs a decomposition. By extension, these two templates/operations should become differentiable on more devices.

@mariaschuld mariaschuld changed the title from "Ensure that MottonenStatePreparation is fully differentiable" to "[unitaryHACK] Ensure that MottonenStatePreparation is fully differentiable" on May 25, 2021
@mariaschuld mariaschuld added the unitaryhack Dedicated issue for Unitary Fund open-source hackathon label May 25, 2021
@josh146 josh146 removed the unitaryhack Dedicated issue for Unitary Fund open-source hackathon label May 31, 2021
@josh146 josh146 changed the title from "[unitaryHACK] Ensure that MottonenStatePreparation is fully differentiable" to "Ensure that MottonenStatePreparation is fully differentiable" on May 31, 2021
josh146 (Member) commented Jun 1, 2021

The following discussion forum post provides a bug fix to get differentiability working with PyTorch: https://discuss.pennylane.ai/t/hybrid-network-not-differentiating/1079/3

josh146 (Member) commented Jun 2, 2021

In particular, here is the relevant git diff:

diff --git a/pennylane/templates/state_preparations/mottonen.py b/pennylane/templates/state_preparations/mottonen.py
index 860eeb80..a7a83544 100644
--- a/pennylane/templates/state_preparations/mottonen.py
+++ b/pennylane/templates/state_preparations/mottonen.py
@@ -203,6 +203,9 @@ def _get_alpha_y(a, n, k):
     with np.errstate(divide="ignore", invalid="ignore"):
         division = numerator / denominator

+    division = qml.math.cast(division, np.float64)
+    denominator = qml.math.cast(denominator, np.float64)
+
     division = qml.math.where(denominator != 0.0, division, 0.0)

     return 2 * qml.math.arcsin(qml.math.sqrt(division))
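
A hedged usage sketch (not part of the patch) of the PyTorch case the cast above is meant to unblock: backpropagating through the template's classical angle computation with the torch interface. The input vector is illustrative:

```python
import torch
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="torch")
def circuit(state_vector):
    qml.templates.MottonenStatePreparation(state_vector, wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

# Leaf tensor holding the trainable amplitudes (already normalized)
state = torch.tensor([0.7, 0.1, 0.5, 0.5], dtype=torch.float64, requires_grad=True)

loss = circuit(state)
loss.backward()      # should populate state.grad without dtype errors
print(state.grad)
```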

antalszava (Contributor) commented

The fix was incorporated in #1400. One test that reproduced the bug was added, though it could be extended to the entire class of tests suggested in the original description.
