[Return-types #7] QNode integration with custom gradient and autograd #3041
Conversation
Hello. You may have forgotten to update the changelog!
Codecov Report
@@ Coverage Diff @@
## master #3041 +/- ##
========================================
Coverage 99.69% 99.69%
========================================
Files 275 275
Lines 24012 24170 +158
========================================
+ Hits 23938 24097 +159
+ Misses 74 73 -1
Great job @rmoyard 💯!
I gave my first pass and left a few comments. Nothing too major from my side other than some questions and small suggestions.
A massive effort, congrats @rmoyard! 👏 💪 Overall no major blockers; I've made some comments where I thought smaller changes could improve things.
A couple of general comments:
- I think shapes and types are already checked in many places, and I've left comments at a few further spots. Overall we have to be more careful with these than before, since they matter more now;
- The documentation and commenting in the PR are looking top-notch, and the code overall looks very clean 💯
@eddddddy @antalszava It is ready for a new round of reviews! Two things I could not solve are:
Also, please go through the open comments and close the ones where my answer was sufficient.
Looks good! 💪 🌟 Remaining comments are minor.
Thanks for the changes @rmoyard 💯
Looks good to me as long as the rest of @antalszava's comments are addressed. 🙂
[sc-28429]
Context:
This PR makes it possible to use the new return types with QNodes and custom derivatives (finite-diff, parameter-shift, and adjoint) in Autograd.
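As a conceptual illustration of the custom-gradient machinery this PR wires up (a plain-NumPy sketch, not the PennyLane implementation — `finite_diff_jacobian` and `vjp` are hypothetical helpers), a finite-difference vector-Jacobian product can be written as:

```python
import numpy as np

def finite_diff_jacobian(f, params, h=1e-7):
    """Forward finite-difference Jacobian of f at params
    (hypothetical helper, not PennyLane's finite-diff transform)."""
    f0 = np.asarray(f(params))
    jac = np.empty(f0.shape + params.shape)
    for idx in np.ndindex(params.shape):
        shifted = params.copy()
        shifted[idx] += h
        jac[(...,) + idx] = (np.asarray(f(shifted)) - f0) / h
    return jac

def vjp(f, params, cotangent, h=1e-7):
    """Vector-Jacobian product: contract the cotangent with the Jacobian."""
    jac = finite_diff_jacobian(f, params, h)
    return np.tensordot(cotangent, jac, axes=cotangent.ndim)

params = np.array([0.1, 0.2])
f = lambda p: np.array([np.sin(p[0]), np.cos(p[1])])
grad = vjp(f, params, np.array([1.0, 1.0]))  # ≈ [cos(0.1), -sin(0.2)]
```

Registering such a VJP is what lets Autograd backpropagate through a QNode whose gradient is computed by the device or a transform rather than by tracing.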
Description of the Change:
- The `custom_vjp` function is now separated between multi and single measurements: `custom_vjp_single` and `custom_vjp_multi`.
- Test files: `test_autograd.py`, `test_autograd_qnode.py`, and `test_vjp.py`.
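The single/multi measurement split can be sketched conceptually as follows (hypothetical NumPy helpers, not the actual `custom_vjp_single`/`custom_vjp_multi` code): a single measurement contracts one cotangent with one Jacobian, while multiple measurements sum one such contraction per measurement.

```python
import numpy as np

def vjp_single(jac, cotangent):
    """VJP for a single measurement: contract the cotangent with one
    Jacobian. (Sketch only, not PennyLane's custom_vjp_single.)"""
    return np.tensordot(cotangent, jac, axes=cotangent.ndim)

def vjp_multi(jacs, cotangents):
    """VJP for multiple measurements: one Jacobian per measurement,
    contributions summed into a single gradient over the parameters."""
    return sum(vjp_single(j, c) for j, c in zip(jacs, cotangents))

# Example: a scalar expectation value and a 2-outcome probability
# vector, both depending on 2 trainable parameters (made-up values).
jac_expval = np.array([1.0, 2.0])               # shape (num_params,)
jac_probs = np.array([[1.0, 0.0], [0.0, 1.0]])  # shape (outcomes, num_params)
cotangents = [np.array(1.0), np.array([0.5, 0.5])]
grad = vjp_multi([jac_expval, jac_probs], cotangents)  # shape (2,)
```

Splitting the two cases avoids branching on measurement structure inside one function and keeps the single-measurement path free of tuple handling.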
Benefits:
The new return types system is almost fully compatible with Autograd (ragged arrays are not supported).
Possible Drawbacks:
Users must apply `autograd.numpy.hstack` for multiple measurements or for the second derivative. Potential slowdown due to post-processing.
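To illustrate the stacking drawback (plain NumPy here; `autograd.numpy.hstack` mirrors `numpy.hstack`, and the measurement values are made up):

```python
import numpy as np  # autograd.numpy.hstack mirrors numpy.hstack

# Hypothetical results of a QNode with two measurements: a scalar
# expectation value and a 2-outcome probability vector.
expval = np.array(0.8)
probs = np.array([0.9, 0.1])

# Autograd cannot differentiate through a tuple of differently shaped
# arrays, so the measurements are stacked into one flat array first:
stacked = np.hstack([expval, probs])  # shape (3,)
```

In user code the stacking would wrap the QNode call itself, so that the function being differentiated returns a single array.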
TODO:
[sc-25813]