
AdaptiveAvgPool2d #27

Closed
shrutishrestha opened this issue Oct 15, 2020 · 5 comments

@shrutishrestha

shrutishrestha commented Oct 15, 2020

Why is nn.AdaptiveAvgPool2d(1) used here?

class ASPPPooling(nn.Sequential):
    def __init__(self, in_channels, out_channels):
        super(ASPPPooling, self).__init__(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_channels, out_channels, 1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True))

    def forward(self, x):
        size = x.shape[-2:]
        x = super(ASPPPooling, self).forward(x)
        return F.interpolate(x, size=size, mode='bilinear', align_corners=False)

I am doing a segmentation task, and this pooling changes my output from torch.Size([1, 256, 16, 16]) to torch.Size([1, 256, 1, 1]), giving the error:
"Expected more than 1 value per channel when training, got input size torch.Size([1, 256, 1, 1])"

What could have gone wrong?
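
For reference, a minimal sketch that reproduces the reported shapes and error (assuming a recent PyTorch; the 1-sample, 256-channel, 16x16 input is taken from the message above, and the intermediate Conv2d is skipped since it does not change the 1x1 spatial size):

import torch
import torch.nn as nn

x = torch.randn(1, 256, 16, 16)      # batch_size = 1, as in the report
pooled = nn.AdaptiveAvgPool2d(1)(x)
print(pooled.shape)                  # torch.Size([1, 256, 1, 1])

bn = nn.BatchNorm2d(256).train()     # training mode, as inside ASPPPooling during training
try:
    bn(pooled)
except ValueError as e:
    print(e)                         # Expected more than 1 value per channel when training, ...

Note that forward() later calls F.interpolate(..., size=size), so the 1x1 map is resized back to the original 16x16 before being returned.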

@Haienzi

Haienzi commented Feb 17, 2021

I have the same problem. How did you solve it?

1 similar comment
@Nuller-CV

I have the same problem. How did you solve it?

@shrutishrestha
Author

> I have the same problem. How did you solve it?

Hi, what output size do you want? In nn.AdaptiveAvgPool2d(1), the 1 is the target output size, so the output will have height 1 and width 1.
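
A quick illustration of that point (hypothetical sizes, just to show that the argument is the target output size):

import torch
import torch.nn as nn

x = torch.randn(1, 256, 16, 16)
print(nn.AdaptiveAvgPool2d(1)(x).shape)        # torch.Size([1, 256, 1, 1])   -> output H = W = 1
print(nn.AdaptiveAvgPool2d((4, 4))(x).shape)   # torch.Size([1, 256, 4, 4])   -> any target (H, W) works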

@VainF
Owner

VainF commented Apr 12, 2021

BatchNorm requires at least 2 values per channel for mean and variance estimation. So, in training mode, please make sure that your batch_size is larger than 1.
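
A small sketch of that point (hypothetical tensors, recent PyTorch assumed): with batch_size 2 there are two values per channel even after global pooling, so BatchNorm2d can estimate mean and variance in training mode.

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(256).train()

try:
    bn(torch.randn(1, 256, 1, 1))              # 1 value per channel -> the error above
except ValueError as e:
    print(e)

print(bn(torch.randn(2, 256, 1, 1)).shape)     # batch_size = 2 -> works: torch.Size([2, 256, 1, 1])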

@DISAPPEARED13

What if batch_size is 1? Can I just replace nn.BatchNorm with nn.InstanceNorm?
