
Add fft submodule to qml.math #1440

Merged — 20 commits merged into master on Feb 9, 2023
Conversation

@josh146 (Member) commented Jun 30, 2021

No description provided.

@josh146 josh146 added the WIP 🚧 Work-in-progress label Jun 30, 2021
@github-actions (Contributor)

Hello. You may have forgotten to update the changelog!
Please edit .github/CHANGELOG.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.

@mariaschuld (Contributor)

Nice! I wonder if the numpy_mimic is causing the tests to fail?

And what would we need to do, just add tests? Or are the math module functions untested?

@josh146 (Member, Author) commented Jun 30, 2021

> Nice! I wonder if the numpy_mimic is causing the tests to fail?

Tests are failing because I stupidly passed self to super(); I forgot that you don't have to, since it's done automatically in Python 3 😆

Just fixed that, so tests should pass now.
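For anyone hitting the same error: in Python 3, super() inside a method needs no arguments and self must not be passed explicitly. A minimal sketch (the class names here are illustrative, not the ones in this PR):

```python
class MimicModule:
    """Illustrative base class for a module-mimicking wrapper."""

    def __init__(self, name):
        self.name = name


class FFTMimic(MimicModule):
    def __init__(self):
        # Correct in Python 3: zero-argument super().
        # Passing self here, e.g. super().__init__(self, "fft"),
        # would hand the instance to the base __init__ as `name`.
        super().__init__("fft")


print(FFTMimic().name)  # fft
```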

> And what would we need to do, just add tests? Or are the math module functions untested?

  • Finish making all needed TF/Torch FFT functions accessible. For now, I imagine just the main ones (fft, ifft, fft2, etc.) as well as some of the helper functions (fftshift) should be sufficient?
  • Writing tests 🙂
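The dispatch idea above can be sketched with a hypothetical registry that maps an interface name to its FFT implementation (this is a simplification for illustration, not PennyLane's actual autoray-based mechanism; only the NumPy branch is registered here):

```python
import numpy as np

# Hypothetical registry: (interface, function name) -> implementation.
_FFT_REGISTRY = {}


def register_fft(interface, name, fn):
    """Register an FFT implementation for a given interface."""
    _FFT_REGISTRY[(interface, name)] = fn


def fft_dispatch(name, tensor, interface="numpy"):
    """Look up and apply the registered FFT function."""
    return _FFT_REGISTRY[(interface, name)](tensor)


# Register the main functions and one helper for the NumPy interface.
register_fft("numpy", "fft", np.fft.fft)
register_fft("numpy", "ifft", np.fft.ifft)
register_fft("numpy", "fft2", np.fft.fft2)
register_fft("numpy", "fftshift", np.fft.fftshift)

x = np.array([1.0, 0.0, -1.0, 0.0])
print(fft_dispatch("fft", x))
```

A TF/Torch interface would register torch.fft.fft, tf.signal.fft, etc. under the same names, so callers stay framework-agnostic.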

@codecov bot commented Jun 30, 2021

Codecov Report

Merging #1440 (615ecd8) into master (03e7f50) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##           master    #1440   +/-   ##
=======================================
  Coverage   99.82%   99.82%           
=======================================
  Files         329      329           
  Lines       28822    28848   +26     
=======================================
+ Hits        28771    28797   +26     
  Misses         51       51           
Impacted Files                      Coverage    Δ
pennylane/math/__init__.py          100.00%  <100.00%>  (ø)
pennylane/math/single_dispatch.py    99.72%  <100.00%>  (+0.01%) ⬆️


@josh146 (Member, Author) commented Dec 16, 2021

@glassnotes @mariaschuld @dwierichs... is this still needed? I have completely forgotten the context of this PR :(

@dwierichs (Contributor)

It seems to be almost done? We have not talked much about differentiability of parameter-shift rules, or of the reconstruction functionality with respect to the frequencies or reconstruction/evaluation points, since summer. It might be nice to leave this open and merge it once we actually want some functionality like this; for now it does not appear to be needed.

@josh146 (Member, Author) commented Dec 16, 2021

@dwierichs Yes, as far as I can tell it only needs tests. Okay, let's leave it open for now.

@josh146 (Member, Author) commented Jan 5, 2023

@dwierichs is this something that is now worth adding?

@dwierichs (Contributor)

I tried to write tests for this.
I can't get the right result in TensorFlow so far, and the following code crashes my Python session:

```python
import torch

def real_fft(x):
    # Real part of the discrete Fourier transform of x.
    return torch.real(torch.fft.fft(x))

x = torch.tensor([0.5], requires_grad=True)
torch.autograd.functional.jacobian(real_fft, x)
```
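As a framework-independent sanity check for such tests (not part of this PR): since the DFT is linear, the Jacobian of Re(fft(x)) for real x is exactly d Re(F x)_k / d x_n = cos(2πkn/N), so a finite-difference Jacobian must match the analytic one. A NumPy sketch:

```python
import numpy as np

N = 4
x = np.random.default_rng(0).normal(size=N)

# Central finite-difference Jacobian of real(fft(x)).
eps = 1e-6
jac_fd = np.zeros((N, N))
for n in range(N):
    dx = np.zeros(N)
    dx[n] = eps
    jac_fd[:, n] = (np.fft.fft(x + dx).real - np.fft.fft(x - dx).real) / (2 * eps)

# Analytic Jacobian of the linear map: d Re(F x)_k / d x_n = cos(2*pi*k*n/N).
k, n = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
jac_exact = np.cos(2 * np.pi * k * n / N)

print(np.allclose(jac_fd, jac_exact, atol=1e-5))  # True
```

Any interface's autodiff Jacobian of real(fft(x)) should reproduce jac_exact up to numerical precision.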

@josh146 (Member, Author) commented Jan 30, 2023

@dwierichs did you get the tests working based on the recent commit?

@dwierichs (Contributor)

@josh146 Yes. Complex-valued derivatives are a bit strange in TensorFlow, to me at least. And the Torch problem was self-made; I had a local version conflict.
This PR should be ready to go soonish, modulo copying the now-working tests to the other three registered functions.

@dwierichs (Contributor)

This could use some reviews now 😅 @josh146

@josh146 (Member, Author) commented Feb 6, 2023

Nice work @dwierichs! All looks good to me, and very thorough test coverage. Unfortunately, I cannot approve since I opened this PR 😆

@dwierichs dwierichs changed the title [WIP] Add support for differentiable FFT Add fft submodule to qml.math Feb 6, 2023
@dwierichs dwierichs added interface 🔌 Classical machine-learning interfaces and removed WIP 🚧 Work-in-progress labels Feb 8, 2023
@dwierichs dwierichs enabled auto-merge (squash) February 9, 2023 11:07
@dwierichs dwierichs merged commit 4df0f69 into master Feb 9, 2023
@dwierichs dwierichs deleted the ffty branch February 9, 2023 11:23
mudit2812 pushed a commit that referenced this pull request Apr 13, 2023
* [WIP] Add support for differentiable FFT

* fix

* fix

* fix

* lint and black

* start tests

* working tests

* lint 1/2

* lint tests

* complete tests

* add file

* test update

* changelog

* coverage

* lint

---------

Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>