
Tests for verifying behaviour of BatchNorm using 0-dim batch sizes. #32384

Closed
v0dro wants to merge 7 commits

Conversation

@v0dro (Contributor) commented Jan 18, 2020

The `BatchNorm*` part of the issue (see gh-12013) seems to have been fixed on the master branch, and these tests make that fix concrete.

However, I would appreciate comments on #12013 (comment) as to whether the current behaviour is satisfactory.
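For concreteness, here is a minimal sketch of the behaviour under test; the module choice and shapes are illustrative, not the PR's exact test code:

```python
import torch
import torch.nn as nn

# Illustrative check (not the PR's exact test): a zero-sized batch
# should pass through BatchNorm in training mode without raising.
bn = nn.BatchNorm2d(3)
x = torch.randn(0, 3, 2, 2, requires_grad=True)  # batch size 0

out = bn(x)
assert out.shape == x.shape  # the empty batch dimension is preserved

# Backward through the empty batch should also succeed.
out.sum().backward()
```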

@kostmo (Member) commented Jan 18, 2020

💊 CircleCI build failures summary and remediations

As of commit d123d0e: the commit was recently pushed; waiting for builds.


This comment was automatically generated by Dr. CI.

@ppwwyyxx (Contributor) commented Jan 19, 2020

There is already work in #30035. However, that PR does not check that the gradients of weight & bias should be zero rather than None; it does not check `running_mean` & `running_var`; and it does not check the gradient w.r.t. the input. It might be better to improve upon the existing one with these extra checks.
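For concreteness, a hedged sketch of those extra checks (the module choice and shapes are illustrative, and the expected outcomes are those described in the discussion above, not assertions taken from the PR):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(3, affine=True, track_running_stats=True)
mean_before = bn.running_mean.clone()
var_before = bn.running_var.clone()

x = torch.randn(0, 3, 2, 2, requires_grad=True)
bn(x).sum().backward()

# Gradients of weight & bias should be zero tensors, not None.
assert torch.equal(bn.weight.grad, torch.zeros_like(bn.weight))
assert torch.equal(bn.bias.grad, torch.zeros_like(bn.bias))

# Running statistics should be unchanged by an empty batch.
assert torch.equal(bn.running_mean, mean_before)
assert torch.equal(bn.running_var, var_before)

# The gradient w.r.t. the input should exist and keep the empty shape.
assert x.grad is not None and x.grad.shape == x.shape
```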

@v0dro (Contributor, Author) commented Jan 24, 2020

I've updated the previous test.

However, the exact expected behaviour of `bn.weight.grad` is still unclear to me. Do you want it to be initialized to zeros, or should it stay None?
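As a hypothetical illustration of the two candidate behaviours (not code from the PR):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(4)
# Before any backward pass, parameter gradients are always None.
assert bn.weight.grad is None

# After backward through an empty batch, the open question is whether
# weight.grad stays None ("parameter never used") or becomes a zero
# tensor ("used, but received no gradient signal").
bn(torch.randn(0, 4)).sum().backward()
print(bn.weight.grad)
```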

@v0dro (Contributor, Author) commented Jan 26, 2020

@ppwwyyxx can you please review this?

Review thread on test/test_nn.py (outdated, resolved)
@rgommers rgommers changed the title Tests for batch accepting 0 dim input. Make BatchNorm accept empty batch size Jan 26, 2020
@yf225 added the labels module: nn (Related to torch.nn) and triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Jan 29, 2020
@v0dro (Contributor, Author) commented Jan 31, 2020

WIP for now. Will ping once I'm sure the tests are passing.

@v0dro (Contributor, Author) commented Jan 31, 2020

@rgommers can you please have a look at this one?

@v0dro (Contributor, Author) commented Feb 3, 2020

@ppwwyyxx this one seems ready to merge. Do have a look.

@v0dro v0dro changed the title Make BatchNorm accept empty batch size Tests for verfying behaviour of BatchNorm using 0-dim batch sizes. Feb 3, 2020
@ppwwyyxx (Contributor) commented Feb 3, 2020

This looks good to me but I don't have permission to merge pytorch diffs.

@rgommers rgommers changed the title Tests for verfying behaviour of BatchNorm using 0-dim batch sizes. Tests for verifying behaviour of BatchNorm using 0-dim batch sizes. Feb 3, 2020
@rgommers rgommers requested a review from ezyang February 3, 2020 18:57
@rgommers (Collaborator) commented Feb 3, 2020

LGTM too.

@facebook-github-bot (Contributor) left a comment


@ngimel has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented

@ngimel merged this pull request in 5ca7bf4.

@v0dro v0dro deleted the batch-norm-0-dim branch February 4, 2020 03:48
BowenBao pushed a commit to BowenBao/pytorch that referenced this pull request Feb 12, 2020
…ytorch#32384)

Summary:
The `BatchNorm*` part of the issue (see pytorch#12013) seems to have been fixed in the master branch and these tests would make it concrete.

However I would appreciate comments on pytorch#12013 (comment) on whether the current behaviour is satisfactory.
Pull Request resolved: pytorch#32384

Differential Revision: D19704154

Pulled By: ngimel

fbshipit-source-id: 1bbbbf1ae1215a460b22cf26e6b263e518ecf60b
ttumiel pushed a commit to ttumiel/pytorch that referenced this pull request Mar 4, 2020 (…ytorch#32384; same commit message as above)
Labels
Merged · module: nn (Related to torch.nn) · open source · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)