Support backprop in default.mixed device for PyTorch #2680
Conversation
Co-authored-by: antalszava <antalszava@gmail.com>
Hello. You may have forgotten to update the changelog!
Codecov Report

```
@@            Coverage Diff            @@
##           master    #2680    +/-  ##
=======================================
  Coverage   99.61%   99.61%
=======================================
  Files         251      251
  Lines       20647    20667      +20
=======================================
+ Hits        20567    20587      +20
  Misses         80       80
```

Continue to review the full report at Codecov.
Looks good @eddddddy! 🎉 🥳
One main piece of feedback is that users might not be able to specify Torch devices as flexibly as they can with default.qubit.torch. Having said that, it may be worth merging this as is, because it should work on CPU; a follow-up PR could then add GPU support. This should just be documented and explained well (changelog and/or device docs).
[sc-21103]
Context:
The third PR in a series to support backpropagation in the default.mixed device.

Description of the Change:
Backpropagation is now supported in the default.mixed device for the PyTorch interface.

Benefits:
Better integration with auto-differentiation frameworks.
Possible Drawbacks:
Related GitHub Issues:
#1528
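As a rough illustration of what backpropagating through a mixed-state simulation involves, the sketch below evolves a one-qubit density matrix through an RX rotation in plain PyTorch and differentiates the resulting expectation value end to end. This is a minimal sketch of the technique, not PennyLane's actual device implementation.

```python
import torch

# Hedged sketch (not PennyLane's implementation): simulate a one-qubit
# mixed state as a density matrix and backpropagate through it.
x = torch.tensor(0.5, dtype=torch.float64, requires_grad=True)

c, s = torch.cos(x / 2), torch.sin(x / 2)
# RX(x) unitary as a complex matrix built from differentiable entries
U = torch.stack([torch.stack([c + 0j, -1j * s]),
                 torch.stack([-1j * s, c + 0j])])

rho0 = torch.tensor([[1, 0], [0, 0]], dtype=torch.complex128)  # |0><0|
rho = U @ rho0 @ U.conj().T                  # evolve the density matrix
Z = torch.tensor([[1, 0], [0, -1]], dtype=torch.complex128)
expval = torch.trace(Z @ rho).real           # <Z> = cos(x)

expval.backward()                            # d<Z>/dx = -sin(x)
print(expval.item(), x.grad.item())
```

Within PennyLane, the same gradient would come from a QNode on the default.mixed device with interface="torch" and diff_method="backprop", which is what this PR enables.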