Change activation modules in C++ from using Tensor& to Tensor #28501

Closed · wants to merge 2 commits

Conversation

t-vi (Collaborator) commented Oct 23, 2019

Sequential does not accept modules whose forward takes Tensor&
(const Tensor& and Tensor both work). Functional and other modules
take Tensor by value when they may modify the input in place.
This changes ReLU and friends to do the same.

Unfortunately, this appears to be BC-breaking at the ABI level.
On the other hand, use of the ReLU module outside Sequential seems
rare (in particular, in C++ models the convention seems to be to
call torch::relu directly instead).

Is the BC break OK here? (@yf225 or anyone else)
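
For concreteness, a minimal sketch of the signature change (`MyReLUImpl` is a hypothetical stand-in, not the actual torch::nn implementation):

```cpp
#include <torch/torch.h>

// Hypothetical stand-in for the real ReLUImpl in torch::nn; the actual
// module also carries options such as `inplace`.
struct MyReLUImpl : torch::nn::Module {
  // Before the change: Tensor forward(Tensor& input);
  // Sequential's type-erased forwarding cannot bind a non-const lvalue
  // reference, so such a module could not be added to Sequential.
  //
  // After the change: take the Tensor by value. The handle is cheap to
  // copy (a reference-counted pointer to the storage), and an in-place
  // op on the copy still mutates the caller's underlying data.
  torch::Tensor forward(torch::Tensor input) {
    return torch::relu_(input);  // in-place variant; reuses the storage
  }
};
TORCH_MODULE(MyReLU);
```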

t-vi requested review from ebetica, goldsborough, and yf225 as code owners on Oct 23, 2019
yf225 (Contributor) left a comment

@t-vi Thanks a lot for the fix! I think changing from Tensor forward(Tensor& input) to Tensor forward(Tensor input) works great :) I left a minor comment.
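
To illustrate what the fix enables, a small usage sketch (assuming the torch::nn module names of the time; not taken from the PR itself):

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  // Hypothetical usage sketch: with forward(Tensor) this composes as
  // expected; with forward(Tensor&), adding ReLU failed to compile.
  torch::nn::Sequential model(
      torch::nn::Linear(10, 5),
      torch::nn::ReLU(),  // now accepted by Sequential
      torch::nn::Linear(5, 1));
  auto out = model->forward(torch::randn({2, 10}));
  std::cout << out.sizes() << std::endl;  // prints [2, 1]
}
```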

Review thread on test/cpp/api/sequential.cpp (outdated, resolved)
Thank you, Will

Co-Authored-By: Will Feng <yf225@cornell.edu>
yf225 approved these changes Oct 23, 2019
yf225 (Contributor) left a comment

@t-vi Thanks so much!

facebook-github-bot (Contributor) left a comment

@yf225 is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


facebook-github-bot (Contributor) commented Oct 23, 2019

@yf225 merged this pull request in 09ad464.

yf225 added the module: cpp label on Oct 28, 2019
v0dro added a commit to v0dro/pytorch that referenced this pull request on Nov 25, 2019
Change activation modules in C++ from using Tensor& to Tensor (pytorch#28501)

Summary: as in the pull request description above.
Pull Request resolved: pytorch#28501

Differential Revision: D18089978

Pulled By: yf225

fbshipit-source-id: ac9aba6dc2081117dece57cd8a15bafe14ec8f51