Documentation cleanup #23148
Conversation
Documentation cleanup gh-metadata: pytorch pytorch 23148 gh/zafartahirov/14/head
@pytorchbot retest this please
torch/nn/quantized/functional.py (Outdated)
:attr:`inplace` is not supported for the quantized relu.

Applies the rectified linear unit function element-wise. See
:class:`~torch.nn.quantized.ReLU` for more details.
Again, torch.nn.functional.ReLU is what you meant to type here.
Actually, about that -- don't we want to refer the functional to its module counterpart? The documentation in torch.nn.quantized.ReLU is more relevant here than the non-quantized counterpart's.
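For illustration, a minimal sketch of the cross-referencing pattern being debated: the functional form's docstring points at its module counterpart via Sphinx's :class: role, so readers land on the quantized-specific docs rather than the plain torch.nn ones. The names mirror the snippet in the diff above, but the body is a hypothetical plain-Python stand-in, not PyTorch's actual quantized kernel.

```python
# Hypothetical stand-in showing the docstring convention under discussion;
# the real torch.nn.quantized.functional.relu dispatches to a quantized
# kernel rather than the plain-Python body shown here.

def relu(input, inplace=False):
    r"""Applies the rectified linear unit function element-wise.

    See :class:`~torch.nn.quantized.ReLU` for more details.

    .. note:: :attr:`inplace` is not supported for the quantized relu.
    """
    if inplace:
        # The quantized functional form rejects in-place application,
        # matching the note in the docstring above.
        raise NotImplementedError(
            "inplace is not supported for the quantized relu")
    return max(input, 0)
```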
Differential Revision: [D16414202](https://our.internmc.facebook.com/intern/diff/D16414202)
@zafartahirov merged this pull request in 5e4c24b.
Stack from ghstack:
Differential Revision: D16414202