Use old and new fft in one program #49695
Comments
Hi @boeddeker, sorry you are facing this issue. This is the expected effect of introducing the torch.fft module. Unfortunately, the fft migration wiki does not have a solution to your problem: https://github.com/pytorch/pytorch/wiki/The-torch.fft-module-in-PyTorch-1.7
Sorry you're experiencing this issue, @boeddeker. The wiki article has a few snippets that are relevant here. For example, you can use this pattern:
Making the module callable was considered, but we wanted to remove the older torch.fft(), not continue to support it, and it would have required changes to TorchScript to support it. Would this pattern work for you? Also, torch.fft.fft() is available starting in 1.7, so another option is to require the torch.fft module if you only support PyTorch 1.7+.
Thank you for the suggestions. I was hoping to get a longer deprecation time and be able to port part by part.
Was it considered to keep the old torch.fft function for a longer deprecation period?
No and no. We really want people to upgrade to complex tensors and stop using float tensors mimicking complex tensors.
I'm sympathetic to this feeling and I'm sorry the change seems so abrupt. This was an unfortunate case where the function vs module conflict was challenging to resolve. The deprecation warning has been in nightlies for 4 months.
You can create a helper, for example:
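(The helper itself was not preserved here; a sketch of such a helper, assuming the old complex-as-real layout with a trailing dimension of size 2 and a 1-D transform — `my_fft` is the name referred to later in the thread:)

```python
import torch

def my_fft(t):
    # t: float tensor in the old "complex" layout, i.e. its last
    # dimension has size 2 and holds (real, imaginary) pairs.
    if callable(torch.fft):
        # PyTorch <= 1.7: torch.fft is still the old function.
        return torch.fft(t, signal_ndim=1)
    # PyTorch >= 1.8: torch.fft is a module; go through complex tensors.
    return torch.view_as_real(torch.fft.fft(torch.view_as_complex(t)))
```

On both code paths the input and output stay in the (..., 2) real layout, so callers do not need to change.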
I recognized it some time ago, but I ignored it because I had observed some issues with complex number support in PyTorch (missing CPU, GPU, or gradient support in some functions).
Thank you. That example is helpful for comparing the old code with the new code.
@boeddeker please let us know if you run into missing complex number support -- we clearly don't want to push people to use incomplete APIs, so it would help us prioritize if you run into any issues.
Thank you for your Python example. What about migrating fft in the PyTorch C++ API?
I believe you would use the same logic. All the functions used in the example above are available in the C++ API, too.
The example my_fft() above fails for me with: Tensor must have a last dimension of size 2. It also fails when I call it like this:
@AGenchev the example you copied is for inputs in the old complex-as-real format, which require a last dimension of size 2 holding the real and imaginary parts.
@peterbell10 my error, thank you for your answer.
Closing this issue because I believe the initial question has been addressed. |
🐛 Bug
In 1.8.0.dev20201221+cpu the default torch.fft changed, i.e. it is now a module and not a callable. It is now very challenging to write code that works with both nightly and older PyTorch versions.
To Reproduce
Steps to reproduce the behavior: call torch.fft(...).

Expected behavior
The call torch.fft(...) should not fail with TypeError: 'module' object is not callable. It should be possible to support the old and the new fft in one program.
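A minimal reproduction sketch (assuming a tensor in the old complex-as-real layout, i.e. trailing dimension of size 2):

```python
import torch

t = torch.randn(4, 2)  # old-style complex input: last dimension of size 2
try:
    # Works on PyTorch <= 1.7 (with a deprecation warning);
    # on 1.8 nightlies torch.fft is a module, so calling it raises.
    torch.fft(t, 1)
except TypeError as e:
    print(e)  # 'module' object is not callable
```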
Possible Solution
There is a Python hack to support callable Python modules:
https://stackoverflow.com/a/48100440/5766934
Environment
How you installed PyTorch (conda, pip, source): pip

Additional context
#42175 looks like the main issue for the fft change, so maybe @mruberry or @peterbell10 have an opinion on this.
cc @mruberry @peterbell10 @walterddr