Change activation modules in C++ from using Tensor& to Tensor #28501
Conversation
Sequential does not accept modules whose forward takes Tensor& (const Tensor& and Tensor are both fine). Functional and others take Tensor when they may want to change things in place. This changes ReLU and friends to do the same. Unfortunately, this appears to be BC-breaking at the ABI level. On the other hand, use of the ReLU module seems rare enough outside Sequential (in particular, in C++ models the convention seems to be to use torch::relu instead).
Thank you, Will!
Co-Authored-By: Will Feng <yf225@cornell.edu>
@yf225 is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
v0dro added a commit to v0dro/pytorch that referenced this pull request on Nov 25, 2019:
…h#28501)
Pull Request resolved: pytorch#28501
Differential Revision: D18089978
Pulled By: yf225
fbshipit-source-id: ac9aba6dc2081117dece57cd8a15bafe14ec8f51
t-vi commented Oct 23, 2019:
Is the BC breakage OK here? (@yf225 or anyone else)