
parameter numbers of the entire model? #4

Closed
Sunting78 opened this issue Jun 23, 2022 · 2 comments
Comments

@Sunting78

Thanks for the great work!
I want to ask about something that confuses me. In the paper, Table 4 shows that the parameter count of the entire model does not increase. However, I see that in the code the prototype tensor grows with class_num, as follows:
self.prototypes = nn.Parameter(torch.zeros(self.num_classes, self.num_prototype, in_channels), requires_grad=True)
If class_num increases, the number of prototype parameters also increases. Did I get it wrong?

@tfzhou
Owner

tfzhou commented Jun 23, 2022

Hi @Sunting78, thanks for your interest. Table 4 actually reports the number of learnable parameters, and in our model self.prototypes is non-learnable. P.S.: although requires_grad is set to True in the code you refer to, it is changed to False immediately in the forward function, which is equivalent to setting requires_grad=False directly in the definition.
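A minimal sketch of this behavior (not the authors' full model; class/parameter names other than self.prototypes are made up for illustration): the prototypes are declared as an nn.Parameter, but gradient flow is switched off in forward, so they never count as trainable parameters during training.

```python
import torch
import torch.nn as nn

class ProtoHead(nn.Module):
    """Toy prototype head illustrating the non-learnable-prototype trick."""

    def __init__(self, num_classes=19, num_prototype=10, in_channels=720):
        super().__init__()
        # Declared as a Parameter with requires_grad=True (as in the repo)...
        self.prototypes = nn.Parameter(
            torch.zeros(num_classes, num_prototype, in_channels),
            requires_grad=True,
        )

    def forward(self, x):
        # ...but gradients are disabled here on every forward pass, so the
        # prototypes are effectively non-learnable and add nothing to the
        # trainable-parameter count, regardless of num_classes.
        self.prototypes.requires_grad_(False)
        return x

head = ProtoHead()
head(torch.randn(1, 720))  # after one forward pass...
trainable = sum(p.numel() for p in head.parameters() if p.requires_grad)
print(trainable)  # ...no trainable parameters remain
```

So a larger class_num enlarges the prototype tensor, but since those entries are updated by non-gradient means (not the optimizer), the learnable-parameter count reported in Table 4 is unaffected.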

@Sunting78
Author

@tfzhou Thanks for your reply. I get it now.
