Add a warning message that torch.sign would not support complex numbers #43280
Conversation
💊 CI failures summary and remediations, as of commit e36d887 (more details on the Dr. CI page):
🕵️ 1 new failure recognized by patterns. The following CI failure does not appear to be due to upstream breakages: pytorch_linux_xenial_py3_clang5_mobile_build (1/1), step "Build".
aten/src/ATen/native/UnaryOps.cpp (outdated)
@@ -266,7 +266,10 @@ Tensor& rsqrt_out(Tensor& result, const Tensor& self) { return unary_op_impl_out
 Tensor rsqrt(const Tensor& self) { return unary_op_impl(self, at::rsqrt_out); }
 Tensor& rsqrt_(Tensor& self) { return unary_op_impl_(self, at::rsqrt_out); }

-Tensor& sign_out(Tensor& result, const Tensor& self) { return unary_op_impl_out(result, self, sign_stub); }
+Tensor& sign_out(Tensor& result, const Tensor& self) {
+  TORCH_CHECK(!self.is_complex(), "Unlike NumPy, torch.sign is not intended to support complex numbers.");
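The guard added in the diff above can be mirrored in plain Python for illustration (a hypothetical sketch of the check's behavior, not the actual ATen implementation):

```python
def sign_out(result, values):
    # Mirror of the TORCH_CHECK guard above: reject complex input early
    # with the same message, otherwise write the elementwise sign.
    if any(isinstance(v, complex) for v in values):
        raise RuntimeError(
            "Unlike NumPy, torch.sign is not intended to support complex numbers.")
    result[:] = [(v > 0) - (v < 0) for v in values]
    return result
```

As in the C++ version, the check fires before any computation, so complex input fails fast with an explanatory message instead of a generic "not implemented" error.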
I don't think we need to add this. I think the current error is sufficient.
>>> a=torch.randn(4, dtype=torch.cdouble)
>>> a.sign()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
RuntimeError: "sign_cpu" not implemented for 'ComplexDouble'
Perhaps you can change the message in your #39955? This at least points users to the proper function.
Sorry, I don't follow. Which message are you suggesting to update?
This message that I added here: "Unlike NumPy, torch.sign is not intended to support complex numbers. Please use sgn instead."
I don't think we generally add these warning messages unless we are breaking BC or something. cc. @mruberry
Redirecting users to look at torch.sgn (when it becomes available) seems like a helpful error message. It could be separated from the torch.sgn PR (as a follow-up like this). I would wait to add an error message here until torch.sgn is available, though.
@xuhdev can you update the error message to what you mentioned above:
"Unlike NumPy, torch.sign is not intended to support complex tensors. Please use torch.sgn() instead."
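For reference, torch.sgn, which the message points users to, computes z / |z| elementwise for nonzero complex z (and 0 where z is zero). A plain-Python sketch of that semantics:

```python
def sgn(z: complex) -> complex:
    # z / |z| for nonzero input, 0 otherwise -- the complex analogue of sign.
    return z / abs(z) if z != 0 else 0j
```

For z = 3 + 4j this yields 0.6 + 0.8j, a unit-magnitude value with the same phase as z, which is what torch.sign cannot meaningfully produce for complex input.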
@@ -5985,7 +5985,9 @@ def test_complex_assert_raises(self, device):
         self.assertRaises(RuntimeError,
                           lambda: zeros.index_add(0, torch.arange(0, size[0], dtype=torch.long, device=device), tensor))

-        self.assertRaises(RuntimeError, lambda: torch.sign(torch.tensor([4j], device=device, dtype=dtype)))
+        with self.assertRaisesRegex(RuntimeError,
Thanks, it's good to have a test checking that sign throws an error for complex inputs.
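The assertRaisesRegex pattern used in the test diff above verifies both the exception type and its message. A self-contained sketch of that pattern (the _sign helper and its message here are stand-ins for illustration, not the real torch.sign):

```python
import unittest

class SignComplexTest(unittest.TestCase):
    @staticmethod
    def _sign(value):
        # Hypothetical stand-in for torch.sign: rejects complex input
        # with the error message proposed in this PR.
        if isinstance(value, complex):
            raise RuntimeError(
                "Unlike NumPy, torch.sign is not intended to support complex "
                "tensors. Please use torch.sgn() instead.")
        return (value > 0) - (value < 0)

    def test_complex_raises_with_message(self):
        # Checks both the exception type and that the message mentions torch.sgn,
        # which a bare assertRaises(RuntimeError, ...) would not do.
        with self.assertRaisesRegex(RuntimeError, r"torch\.sgn"):
            self._sign(4j)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(SignComplexTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Pinning the message text (not just the exception type) is what makes the test catch a regression back to the generic "not implemented" error.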
@mruberry @anjali411 Do you think this is ready to merge now? #39955 has been merged.
Yes, looks good after the error message is updated to reference
@anjali411 I've added the reference now.
lgtm
Updated the PR to fix an error in the test file.
Thanks @xuhdev!
@anjali411 merged this pull request in bcbb6ba.
Stack from ghstack:
Differential Revision: D24538769