
Allow custom activation in SqueezeExcitation of EfficientNet #4448

Merged (4 commits) into pytorch:main on Sep 21, 2021
Conversation

@kazhang (Contributor) commented on Sep 19, 2021:

Partially resolves #4333

It would be great to reuse SqueezeExcitation in the RegNet implementation (#4403).

cc @datumbox

@kazhang requested a review from @datumbox on September 19, 2021 at 23:37
@kazhang marked this pull request as ready for review on September 19, 2021 at 23:37
@datumbox (Contributor) left a comment:
Thanks for the PR @kazhang. This is much needed and will allow us to reuse the layer in multiple places. I left a couple of comments, let me know what you think.

4 review threads on torchvision/models/efficientnet.py (resolved)
@datumbox (Contributor) left a comment:

LGTM, thanks @kazhang!

@kazhang merged commit 8a83cf2 into pytorch:main on Sep 21, 2021
@kazhang deleted the squeeze-excitation-activation branch on September 21, 2021 at 19:04
facebook-github-bot pushed a commit that referenced this pull request Sep 30, 2021
…#4448)

Summary:
* allow custom activation in SqueezeExcitation

* use ReLU as the default activation

* make scale activation parameterizable
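The change summarized above can be sketched roughly as follows. This is an illustrative reconstruction, not the verbatim torchvision code: the block takes `activation` (defaulting to ReLU) and `scale_activation` (defaulting to Sigmoid) as module factories, so callers such as a RegNet implementation can swap in their own activations.

```python
import torch
from torch import nn


class SqueezeExcitation(nn.Module):
    """Squeeze-and-Excitation block with configurable activations (sketch).

    `activation` and `scale_activation` are passed as module constructors,
    mirroring the interface this PR introduces.
    """

    def __init__(self, input_channels, squeeze_channels,
                 activation=nn.ReLU, scale_activation=nn.Sigmoid):
        super().__init__()
        self.avgpool = nn.AdaptiveAvgPool2d(1)  # squeeze: global average pool
        self.fc1 = nn.Conv2d(input_channels, squeeze_channels, 1)
        self.fc2 = nn.Conv2d(squeeze_channels, input_channels, 1)
        self.activation = activation()
        self.scale_activation = scale_activation()

    def _scale(self, x):
        s = self.avgpool(x)
        s = self.activation(self.fc1(s))
        s = self.fc2(s)
        # per-channel attention weights; in [0, 1] when Sigmoid is used
        return self.scale_activation(s)

    def forward(self, x):
        # excite: rescale each input channel by its learned weight
        return x * self._scale(x)


# A caller that wants plain ReLU (e.g. RegNet) can now pass it explicitly:
se = SqueezeExcitation(16, 4, activation=nn.ReLU)
out = se(torch.randn(2, 16, 8, 8))
```

The output has the same shape as the input, since the block only rescales channels.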

Reviewed By: datumbox

Differential Revision: D31268054

fbshipit-source-id: 0202595934dd3de5364d8ee2713324c4a4c9205e

Co-authored-by: Vasilis Vryniotis <datumbox@users.noreply.github.com>
Linked issue this pull request may close: [RFC] API For Common Layers In Torchvision
3 participants