Warning: Mixed memory format inputs detected while calling the operator. #6

Closed
Pluto1314 opened this issue Oct 19, 2020 · 4 comments

@Pluto1314

I have added lambda layers to every block of a ResNet, but the following warning appears. Will it affect the results?

Warning: Mixed memory format inputs detected while calling the operator. The operator will output channels_last tensor even if some of the inputs are not in channels_last format. (function operator())
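For context, this kind of warning comes from PyTorch when a single operator receives inputs whose memory layouts disagree (some channels_last, some default NCHW-contiguous), and the output is then forced to channels_last. Below is a minimal sketch that can surface the same kind of warning on PyTorch versions of that era; the shapes and the elementwise op are only for illustration and are not taken from the lambda layer code:

import torch

# hypothetical repro: mix a channels_last tensor with a default-contiguous one
a = torch.randn(2, 8, 16, 16).contiguous(memory_format=torch.channels_last)
b = torch.randn(2, 8, 16, 16)  # default NCHW-contiguous layout

# an op over mixed-layout inputs is the kind of call that can emit
# "Mixed memory format inputs detected while calling the operator."
c = a * b
print(c.is_contiguous(memory_format=torch.channels_last))  # output follows channels_last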

@lucidrains
Owner

Could you try with a dim_u greater than 1 and see if that error goes away?

@Pluto1314
Author

import torch.nn as nn
from torchvision.models.resnet import conv3x3  # assuming the standard torchvision 3x3 conv helper
from lambda_networks import LambdaLayer


class BasicBlock(nn.Module):        # resnet18, resnet34
    expansion = 1

    def __init__(self, c1, c2, stride=1, downsample=None):
        super(BasicBlock, self).__init__()
        self.conv1 = conv3x3(c1, c2, stride)
        self.bn1 = nn.BatchNorm2d(c2)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = conv3x3(c2, c2)
        self.bn2 = nn.BatchNorm2d(c2)
        self.downsample = downsample
        self.stride = stride

        # lambda layer appended to the block
        self.lambda_layers = LambdaLayer(
            dim = c2,
            dim_out = c2,
            r = 23,
            dim_k = 16,
            heads = 4,
            dim_u = 4
        )

    def forward(self, x):
        identity = x

        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)

        out = self.conv2(out)
        out = self.bn2(out)

        if self.downsample is not None:
            identity = self.downsample(x)

        # lambda layer applied before the residual addition
        out = self.lambda_layers(out)

        out += identity
        out = self.relu(out)
        return out

Yes, dim_u = 4.
That said, I'm not sure whether I made the change in the right place.
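
For comparison, the LambdaNetworks paper swaps the lambda layer in for the spatial 3x3 convolution inside the block rather than appending it after bn2. A rough sketch of the analogous change for a BasicBlock follows; this variant is an assumption for illustration, not something confirmed in this thread:

import torch.nn as nn
from torchvision.models.resnet import conv3x3  # assuming the standard torchvision helper
from lambda_networks import LambdaLayer


class LambdaBasicBlock(nn.Module):
    expansion = 1

    def __init__(self, c1, c2, stride=1, downsample=None):
        super().__init__()
        self.conv1 = conv3x3(c1, c2, stride)
        self.bn1 = nn.BatchNorm2d(c2)
        self.relu = nn.ReLU(inplace=True)
        # lambda layer in place of the second 3x3 conv
        self.lambda_layer = LambdaLayer(dim=c2, dim_out=c2, r=23, dim_k=16, heads=4, dim_u=4)
        self.bn2 = nn.BatchNorm2d(c2)
        self.downsample = downsample

    def forward(self, x):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.lambda_layer(out))
        if self.downsample is not None:
            identity = self.downsample(x)
        out += identity
        return self.relu(out)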

@lucidrains
Owner

@Pluto1314 ok, should be fixed in the latest! b6c8874
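
A quick way to check the fix after upgrading (for example pip install -U lambda-networks, or installing from the commit above): a sketch with arbitrary shapes. The warning originates in C++ but should surface as a Python UserWarning, so warnings.catch_warnings can record it if it still fires.

import warnings
import torch
from lambda_networks import LambdaLayer

layer = LambdaLayer(dim=64, dim_out=64, r=23, dim_k=16, heads=4, dim_u=4)
x = torch.randn(1, 64, 56, 56)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = layer(x)

print(out.shape)    # torch.Size([1, 64, 56, 56])
print(len(caught))  # 0 once the mixed memory format warning no longer appears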

@Pluto1314
Author

Thanks, everything is working now!
