
Dropout2d doesn't drop channels for (C, H, W) #69801

Open
OverLordGoldDragon opened this issue Dec 11, 2021 · 5 comments
Assignees
Labels
high priority · module: correctness (silent) (issue that returns an incorrect result silently) · module: nn (Related to torch.nn) · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Comments

@OverLordGoldDragon
Contributor

OverLordGoldDragon commented Dec 11, 2021

🐛 Describe the bug

The docs state that entire channels (the C dimension) are zeroed out, but this does not occur for unbatched (C, H, W) input:

import torch
torch.manual_seed(0)
ipt = torch.ones((2, 3, 4))
print(torch.nn.Dropout2d(p=0.5)(ipt))
tensor([[[2., 2., 2., 2.],
         [0., 0., 0., 0.],
         [0., 0., 0., 0.]],
        [[2., 2., 2., 2.],
         [2., 2., 2., 2.],
         [0., 0., 0., 0.]]])

but it does for batched (N, C, H, W) input:

torch.manual_seed(0)
print(torch.nn.Dropout2d(p=0.5)(ipt[None]))
tensor([[[[2., 2., 2., 2.],
          [2., 2., 2., 2.],
          [2., 2., 2., 2.]],
         [[0., 0., 0., 0.],
          [0., 0., 0., 0.],
          [0., 0., 0., 0.]]]])

It appears Dropout2d is instead implementing Dropout1d for (N, C, T).
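For reference, the documented channel-wise behavior can be reproduced with a small hand-rolled helper (a hypothetical sketch, not part of torch.nn) that zeroes whole leading-dim channels of a (C, H, W) tensor:

```python
import torch

def channelwise_dropout(x, p=0.5):
    # Hypothetical reference implementation: drop entire channels along
    # dim 0 and scale survivors by 1/(1-p), as the Dropout2d docs describe.
    keep = (torch.rand(x.shape[0]) >= p).to(x.dtype)
    return x * keep.view(-1, *([1] * (x.dim() - 1))) / (1 - p)

out = channelwise_dropout(torch.ones(2, 3, 4))
# By construction, each of the 2 channel slices is either all 0.0 or all 2.0:
for c in range(out.shape[0]):
    print(bool((out[c] == 0).all() or (out[c] == 2).all()))  # True
```

In the issue's output above, by contrast, individual rows within a channel are zeroed, which is why the report concludes the 3D input is being treated as (N, C, T).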

Versions

1.10.0 via Anaconda; Windows 10

cc @ezyang @gchanan @zou3519 @bdhirsh @albanD @mruberry @jbschlosser @walterddr @kshitij12345

@kshitij12345
Collaborator

Dropout3d is affected by the same issue. It seems the underlying implementation always assumes that the input is batched:

sizes.push_back(input_sizes[0]);
sizes.push_back(input_sizes[1]);
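The two hard-coded indices above always treat dims 0 and 1 as (N, C). A shape computation that also handled unbatched input might look like the following sketch (a pure-Python illustration, not the actual ATen code):

```python
def feature_dropout_mask_shape(input_shape, batched):
    # Sketch: channel-wise dropout builds a mask that keeps the batch and
    # channel dims and broadcasts (size 1) over all spatial dims.
    # For unbatched input, the channel dim is dim 0, not dim 1.
    if batched:
        return input_shape[:2] + (1,) * (len(input_shape) - 2)
    return input_shape[:1] + (1,) * (len(input_shape) - 1)

print(feature_dropout_mask_shape((8, 3, 32, 32), batched=True))  # (8, 3, 1, 1)
print(feature_dropout_mask_shape((3, 32, 32), batched=False))    # (3, 1, 1)
```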

@kshitij12345 kshitij12345 added the module: nn Related to torch.nn label Dec 13, 2021
@albanD
Collaborator

albanD commented Dec 13, 2021

cc @jbschlosser should this be fixed when no batch dim support is added?

@kshitij12345
Collaborator

FYI, per the tracker issue #60585, no-batch-dim support has already been added.

@albanD
Collaborator

albanD commented Dec 13, 2021

Ok, so most likely a bug then!

Tentative high pri

@albanD albanD added the high priority and triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) labels Dec 13, 2021
@kshitij12345 kshitij12345 self-assigned this Dec 13, 2021
@jbschlosser jbschlosser added the module: correctness (silent) (issue that returns an incorrect result silently) label Dec 13, 2021
facebook-github-bot pushed a commit that referenced this issue Feb 2, 2022
Summary:
Fixes #69801

TODO:
* [x] Update C++ API

cc albanD mruberry jbschlosser walterddr kshitij12345

Pull Request resolved: #69885

Reviewed By: mruberry

Differential Revision: D33175470

Pulled By: jbschlosser

fbshipit-source-id: c9d7d9e0f59ba290a0157725c338a345f3d58b9f
cyyever pushed commits to cyyever/pytorch_private that referenced this issue Feb 3 and Feb 9, 2022
(cherry picked from commit 7e4271a)
@jbschlosser
Contributor

Reopening this thanks to #79549

TL;DR: The fix for this issue was silently BC-breaking for those who depended on Dropout2d to perform 1D channel-wise dropout on 3D inputs. For some reason, Dropout1d didn't exist before, so Dropout2d was the only option. We'll revert to the old behavior for 1.12 with a warning to move to the newly-added Dropout1d. In a future release, we'll switch back to a no-batch-dim interpretation for 3D inputs.
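On versions that ship it (1.12+), the newly-added Dropout1d handles the (N, C, L) case directly. A minimal usage sketch, assuming torch >= 1.12:

```python
import torch

# Dropout1d performs channel-wise dropout for (N, C, L) and (C, L)
# inputs -- the behavior that 3D inputs to Dropout2d were silently
# getting before the fix.
m = torch.nn.Dropout1d(p=0.5)  # module defaults to training mode
out = m(torch.ones(4, 8, 16))  # (N, C, L)
# Every (n, c) channel is either all zeros or all 2.0 (= 1 / (1 - p)):
ok = all(bool((out[n, c] == 0).all() or (out[n, c] == 2).all())
         for n in range(4) for c in range(8))
print(ok)  # True
```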

@jbschlosser jbschlosser reopened this Jun 15, 2022