
Training does not converge after joining compact bilinear layer #10

Open
roseif opened this issue Sep 27, 2020 · 3 comments

Comments


roseif commented Sep 27, 2020

Source code:
x = self.features(x)  # [4, 512, 28, 28]
batch_size = x.size(0)
x = x.view(batch_size, 512, 28 ** 2)  # [4, 512, 784]; bmm needs a 3-D tensor
x = (torch.bmm(x, torch.transpose(x, 1, 2)) / 28 ** 2).view(batch_size, -1)  # [4, 262144]
x = torch.nn.functional.normalize(torch.sign(x) * torch.sqrt(torch.abs(x) + 1e-10))  # signed sqrt + L2
x = self.classifiers(x)
return x
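
For reference, this is standard bilinear (Gram-matrix) pooling: the [4, 512, 784] feature map is multiplied by its own transpose to give a [4, 512, 512] matrix per image, which is then flattened, signed-sqrt'd, and L2-normalized. A quick shape check with random data standing in for the real features:

import torch

# shape check for the bilinear pooling step above, with random data
# standing in for the real backbone feature map
x = torch.randn(4, 512, 28, 28)       # [N, C, H, W]
batch_size = x.size(0)
x = x.view(batch_size, 512, 28 ** 2)  # [4, 512, 784]
x = (torch.bmm(x, torch.transpose(x, 1, 2)) / 28 ** 2).view(batch_size, -1)  # [4, 262144]
x = torch.nn.functional.normalize(torch.sign(x) * torch.sqrt(torch.abs(x) + 1e-10))
print(x.shape)  # torch.Size([4, 262144])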
My code:
x = self.features(x)  # [4, 512, 28, 28]
x = x.view(x.shape[0], x.shape[1], -1)  # [4, 512, 784]
x = x.permute(0, 2, 1)  # [4, 784, 512]
x = self.mcb(x, x)  # [4, 784, 512]
batch_size = x.size(0)
x = x.sum(1)  # for 2-D, dim=0 sums over rows and dim=1 over columns; here the tensor is 3-D, so this sums over the 784 spatial positions -> [4, 512]
x = torch.nn.functional.normalize(torch.sign(x) * torch.sqrt(torch.abs(x) + 1e-10))
x = self.classifiers(x)
return x

After this modification, training does not converge. Why? Is there a problem with my code?
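
For reference, here is a quick shape check of the modified path; a placeholder (element-wise product) stands in for self.mcb, which in the real code is the compact bilinear layer, assumed here to keep a 512-dimensional output:

import torch

# shape check for the modified path; the placeholder below only mimics the
# output shape of self.mcb (assumed output dim 512), not its actual computation
mcb = lambda a, b: a * b                # placeholder, keeps shape [4, 784, 512]
x = torch.randn(4, 512, 28, 28)         # stand-in for self.features(x)
x = x.view(x.shape[0], x.shape[1], -1)  # [4, 512, 784]
x = x.permute(0, 2, 1)                  # [4, 784, 512]
x = mcb(x, x)                           # [4, 784, 512]
x = x.sum(1)                            # sum over the 784 spatial positions -> [4, 512]
x = torch.nn.functional.normalize(torch.sign(x) * torch.sqrt(torch.abs(x) + 1e-10))
print(x.shape)                          # torch.Size([4, 512])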

@CHTsuperman

Have you solved it? Can you share it?


roseif commented Mar 15, 2022

> Have you solved it? Can you share it?
The learning rate setting may be too high. You can lower it and try again.
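
For example (a minimal sketch; SGD and the 1e-3 value are illustrative assumptions, not settings from this thread):

import torch

# illustrative only: lower the optimizer learning rate and retrain;
# the Linear layer stands in for the real network
model = torch.nn.Linear(512, 200)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)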

@CHTsuperman
Copy link

> Have you solved it? Can you share it?
> The learning rate setting may be too high. You can lower it and try again.

Thank you! I will try it.
