
[Return-types #7] QNode integration with custom gradient and autograd #3041

Merged: rmoyard merged 122 commits into master from return_gradient_qnode on Oct 25, 2022

Conversation

Contributor

@rmoyard rmoyard commented Sep 8, 2022

Context:

This PR makes it possible to use the new return types with QNodes and custom derivatives (finite-diff, parameter-shift and adjoint) with Autograd.
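
For illustration, a minimal usage sketch of what this enables (not taken from the PR; it assumes the opt-in qml.enable_return() toggle used during the return-types transition and a standard default.qubit device):

```python
import pennylane as qml
from pennylane import numpy as np

qml.enable_return()  # assumed opt-in toggle for the new return-type system

dev = qml.device("default.qubit", wires=1)

# QNode differentiated with a custom gradient method through Autograd
@qml.qnode(dev, interface="autograd", diff_method="finite-diff")
def circuit(x):
    qml.RY(x, wires=0)
    return qml.expval(qml.PauliZ(0))

x = np.array(0.3, requires_grad=True)
print(qml.grad(circuit)(x))  # approximately -sin(0.3)
```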

Description of the Change:

  1. Implement new custom VJP functions, now separated between single and multiple measurements: custom_vjp_single and custom_vjp_multi.
  2. Clean up the Autograd interface and adapt it to the new return types.
  3. Update the tests in test_autograd.py, test_autograd_qnode.py and test_vjp.py.

Benefits:

The new return-type system is almost fully compatible with Autograd (no ragged arrays).

Possible Drawbacks:

  1. Users must apply autograd.numpy.hstack for multiple measurements or for second derivatives (see the sketch after this list).

  2. Potential slowdown due to post-processing.
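
A minimal sketch of the hstack workaround from drawback 1 (illustrative only; the circuit and parameter are made up, and the qml.enable_return() toggle is assumed):

```python
import autograd.numpy as anp
import pennylane as qml
from pennylane import numpy as np

qml.enable_return()  # assumed opt-in toggle for the new return-type system

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="autograd", diff_method="parameter-shift")
def circuit(x):
    qml.RX(x, wires=0)
    qml.CNOT(wires=[0, 1])
    # Two measurements: Autograd cannot differentiate the resulting tuple directly
    return qml.expval(qml.PauliZ(0)), qml.probs(wires=[0, 1])

x = np.array(0.4, requires_grad=True)

def cost(x):
    # Stack the (expval, probs) tuple into a single array before differentiating
    return anp.hstack(circuit(x))

jac = qml.jacobian(cost)(x)  # shape (5,): 1 expectation value + 4 probabilities
```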

TODO:

  1. Change the batch transform post processing function.
  2. Add CV derivatives
  3. Warning for shots with state-based quantities (entropies); see "Vn entropy and mutual info shot vec validation check" #3180

[sc-25813]

@rmoyard rmoyard marked this pull request as draft September 8, 2022 21:22
Contributor

github-actions bot commented Sep 8, 2022

Hello. You may have forgotten to update the changelog!
Please edit doc/releases/changelog-dev.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.


codecov bot commented Sep 8, 2022

Codecov Report

Merging #3041 (26d8324) into master (37a4024) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff            @@
##           master    #3041    +/-   ##
========================================
  Coverage   99.69%   99.69%            
========================================
  Files         275      275            
  Lines       24012    24170   +158     
========================================
+ Hits        23938    24097   +159     
+ Misses         74       73     -1     
Impacted Files Coverage Δ
pennylane/_grad.py 100.00% <ø> (ø)
pennylane/gradients/__init__.py 100.00% <100.00%> (ø)
pennylane/gradients/finite_difference.py 100.00% <100.00%> (ø)
pennylane/gradients/parameter_shift.py 100.00% <100.00%> (ø)
pennylane/gradients/vjp.py 100.00% <100.00%> (ø)
pennylane/interfaces/autograd.py 100.00% <100.00%> (ø)
pennylane/interfaces/execution.py 100.00% <100.00%> (ø)
pennylane/qnode.py 100.00% <100.00%> (ø)
pennylane/math/utils.py 100.00% <0.00%> (+0.88%) ⬆️


Contributor

@eddddddy eddddddy left a comment


Great job @rmoyard 💯!

I've given this a first pass and left a few comments. Nothing too major from my side, other than some questions and small suggestions.

Contributor

@antalszava antalszava left a comment


A massive effort, congrats @rmoyard! 👏 💪 Overall there are no major blockers; I've made some comments where I thought smaller changes could improve the PR.

A couple of general comments:

  • Shapes and types are already checked in many places, and I've made comments at a few further spots. Overall I think we have to be more careful with these than before, because they matter more here;
  • The documentation and commenting in the PR are looking top-notch, and the code overall looks very clean 💯

Contributor Author

rmoyard commented Oct 25, 2022

@eddddddy @antalszava It is ready for a new round of reviews!

Two things I could not solve:

  1. @antalszava Why NumPy arrays with shape () are converted to float for a single measurement and a single parameter. I would appreciate your input here (a minimal reproduction sketch follows this list).
  2. @eddddddy The sum with axis=0 does not work with ArrayBox during differentiation; see my comment above. I could not simplify compute_vjp further.
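
A hypothetical minimal reproduction sketch for item 1 (not taken from the PR; it only inspects the type returned for a single measurement and a single parameter, and assumes the qml.enable_return() toggle):

```python
import pennylane as qml
from pennylane import numpy as np

qml.enable_return()  # assumed opt-in toggle for the new return-type system

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, interface="autograd", diff_method="parameter-shift")
def circuit(x):
    qml.RX(x, wires=0)
    return qml.expval(qml.PauliZ(0))  # single measurement

x = np.array(0.5, requires_grad=True)  # single trainable parameter
g = qml.grad(circuit)(x)

# Observation under discussion: the shape-() result may come back as a
# Python float here rather than a 0-d NumPy array.
print(type(g), g)
```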

Also, please go through the open comments and close the ones where my answer was sufficient.

Contributor

@antalszava antalszava left a comment


Looks good! 💪 🌟 Remaining comments are minor.

Contributor

@eddddddy eddddddy left a comment


Thanks for the changes @rmoyard 💯

Looks good to me as long as the rest of @antalszava's comments are addressed. 🙂

@rmoyard rmoyard merged commit f1e9e17 into master Oct 25, 2022
@rmoyard rmoyard deleted the return_gradient_qnode branch October 25, 2022 21:51
Contributor Author

rmoyard commented Oct 26, 2022

[sc-28429]

@eddddddy eddddddy restored the return_gradient_qnode branch October 26, 2022 18:14
@eddddddy eddddddy deleted the return_gradient_qnode branch October 26, 2022 18:15