stft: Change require_complex warning to an error #49022
Conversation
💊 CI failures summary and remediations
As of commit a643614 (more details on the Dr. CI page):
🕵️ 1 new failure recognized by patterns. The following CI failure does not appear to be due to upstream breakages:
pytorch_libtorch_linux_xenial_cuda11_1_cudnn8_py3_gcc7_build (1/1), Step: "Build"
aten/src/ATen/native/SpectralOps.cpp (outdated)
  }
  TORCH_CHECK(
      return_complexOpt.has_value() || return_complex,
      "stft requires the return_complex parameter be explicitly "
This is good. Do you think we should also take this opportunity to deprecate return_complex=False?
torch/functional.py (outdated)
  Setting :attr:`return_complex` explicitly will be required in a future
  PyTorch release. Set it to False to preserve the current behavior or
  True to return a complex output.
  From version 1.8.0, :attr:`return_complex` must be given explicitly for
This warning seems great. Same question about whether we should deprecate return_complex=False now, too?
Seems like we should? I don't think we intend to support float tensors as surrogates for complex tensors long term.
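To illustrate the "float tensors as surrogates for complex tensors" point, here is a small numpy sketch (numpy is used purely for illustration and is not the PyTorch code under review): a complex spectrogram value versus the real-valued surrogate layout with a trailing (real, imag) dimension, which is what stft returned before `return_complex`.

```python
import numpy as np

# A complex representation of two spectrogram bins.
spec = np.array([1 + 2j, 3 - 4j])  # dtype complex128

# The float "surrogate" layout: same bytes reinterpreted as float64
# pairs, with a trailing dimension of size 2 holding (real, imag).
surrogate = spec.view(np.float64).reshape(-1, 2)

# surrogate[i, 0] is the real part of spec[i], surrogate[i, 1] the
# imaginary part.
```

The surrogate carries the same data but loses the complex dtype, so downstream code must remember which trailing dimension is (real, imag); that bookkeeping is a reason not to support the float layout long term.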
ghstack-source-id: 0c3cabe6a4bbd883273aa888e0bdb94a1687e1bc
Pull Request resolved: pytorch#49022
@mruberry I've rebased. Test failures look unrelated.
      "(which will be required in a future pytorch release). "
  );

  TORCH_WARN_ONCE(
This warning needs to be gated on the value actually being false, though. We don't want people who set return_complex=True to see it.
Oh wait... I understand how this will work now. OK. That's cool.
So the conditional only triggers if return_complex is unspecified or False and it needs to be specified. Then the check will capture the case where it's unspecified. If the check is hit this function will stop (since it threw an exception). If not, then the value is specified but it's False, so the warning is thrown.
Cool.
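The control flow described above can be sketched in plain Python (a hypothetical helper, not the actual ATen implementation): error when `return_complex` is unspecified, warn once when it is explicitly False, and pass through when it is True.

```python
import warnings


def resolve_return_complex(return_complex=None):
    """Sketch of the stft return_complex handling discussed above.

    Hypothetical helper mirroring the described control flow; the real
    check lives in aten/src/ATen/native/SpectralOps.cpp.
    """
    if return_complex is None:
        # TORCH_CHECK case: the caller never specified a value.
        raise ValueError(
            "stft requires the return_complex parameter be explicitly "
            "specified")
    if return_complex is False:
        # TORCH_WARN_ONCE case: specified, but the deprecated real output.
        warnings.warn(
            "return_complex=False is deprecated; prefer return_complex=True",
            DeprecationWarning)
    return return_complex
```

So `return_complex=True` is silent, `return_complex=False` warns, and omitting the argument raises, matching the combination of check and warning in the diff.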
Cool!
Hey @peterbell10, my mistake on landing this a little too swiftly. It looks like torchaudio has yet to update their stft calls. Let me ping them to do that before we land this so we don't break people using them.
Torchaudio issue: pytorch/audio#1095
Summary:
Pull Request resolved: pytorch#49022
Test Plan: Imported from OSS
Reviewed By: ngimel
Differential Revision: D25569586
Pulled By: mruberry
fbshipit-source-id: 09608088f540c2c3fc70465f6a23f2aec5f24f85
BC-breaking note:
Previously torch.stft took an optional return_complex parameter that indicated whether the output would be a floating point tensor or a complex tensor. By default, return_complex was False, to be consistent with the previous behavior of torch.stft. This PR changes this behavior so return_complex is a required argument.

PR Summary:
Stack from ghstack:
Differential Revision: D25658906