
Revert device gradients fix #2595

Merged: 23 commits into master on Jun 2, 2022

Conversation

@puzzleshark (Contributor) commented May 20, 2022

@github-actions (Contributor) commented:

Hello. You may have forgotten to update the changelog!
Please edit doc/releases/changelog-dev.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.
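
As an illustration only (wording inferred from the PR title; the exact entry format should follow whatever convention doc/releases/changelog-dev.md already uses), such an entry might look roughly like:

    * Reverted the device gradients fix. (#2595)

with @puzzleshark added to the contributors section.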

@codecov bot commented May 20, 2022

Codecov Report

Merging #2595 (9fdbb72) into master (3fa7518) will increase coverage by 26.65%.
The diff coverage is n/a.

@@             Coverage Diff             @@
##           master    #2595       +/-   ##
===========================================
+ Coverage   72.92%   99.58%   +26.65%     
===========================================
  Files         245      245               
  Lines       19726    19724        -2     
===========================================
+ Hits        14386    19642     +5256     
+ Misses       5340       82     -5258     
Impacted Files                                        | Coverage Δ
------------------------------------------------------|----------------------------
pennylane/interfaces/autograd.py                      | 100.00% <ø> (ø)
pennylane/interfaces/execution.py                     | 100.00% <0.00%> (+0.78%) ⬆️
pennylane/gradients/vjp.py                            | 100.00% <0.00%> (+1.44%) ⬆️
...mplates/subroutines/fermionic_double_excitation.py | 100.00% <0.00%> (+2.14%) ⬆️
pennylane/math/quantum.py                             | 100.00% <0.00%> (+2.38%) ⬆️
pennylane/transforms/hamiltonian_expand.py            | 100.00% <0.00%> (+2.77%) ⬆️
pennylane/_grad.py                                    | 100.00% <0.00%> (+3.03%) ⬆️
pennylane/transforms/split_non_commuting.py           | 100.00% <0.00%> (+3.33%) ⬆️
pennylane/devices/default_qubit.py                    | 100.00% <0.00%> (+3.48%) ⬆️
pennylane/transforms/batch_input.py                   | 100.00% <0.00%> (+4.00%) ⬆️
... and 157 more

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3fa7518...9fdbb72.

@antalszava self-requested a review May 20, 2022 15:39

@antalszava (Contributor) left a comment:

Thank you @puzzleshark! It would be great to add a test case to the test suite as per the suggested example from the relevant issue, such that we can be sure that the problem doesn't arise again.

Also, be sure to add a changelog item. 🙂
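
For illustration, a minimal sketch of the kind of regression test being suggested, assuming the underlying issue concerned differentiating a QNode with a device-backed gradient method on default.qubit; the diff_method, circuit, and test name below are assumptions, not the test that was actually added in this PR:

```python
# Hypothetical regression-test sketch (not the actual test from this PR).
# Assumes the issue concerned device-backed gradients on default.qubit.
import pennylane as qml
from pennylane import numpy as np


def test_gradient_matches_analytic_value():
    dev = qml.device("default.qubit", wires=1)

    @qml.qnode(dev, diff_method="adjoint")  # device-side gradient method (assumed)
    def circuit(x):
        qml.RX(x, wires=0)
        return qml.expval(qml.PauliZ(0))

    x = np.array(0.4, requires_grad=True)
    # <Z> = cos(x) for this circuit, so d<Z>/dx = -sin(x)
    assert np.allclose(qml.grad(circuit)(x), -np.sin(x))
```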

@antalszava self-requested a review June 2, 2022 14:50

@antalszava (Contributor) left a comment:

Looks good 🎉

@antalszava merged commit 3344c77 into master on Jun 2, 2022
@antalszava deleted the revert-device-gradients-fix branch on June 2, 2022, 17:51
Labels: none. Projects: none. Linked issues: none. Participants: 2.