The forward passes of the GeneratorNet network and the Net network (the feature extractor) multiply A_rebuild by self.s, which is set to 10, before passing it to the classifier.
What is this for? Why can't we pass it directly to the FC layer for classification?
self.s corresponds to the learnable temperature \alpha described in our paper.
To my knowledge, the term "temperature" comes from [1] and was adopted by [2] (which we also cite) in few-shot learning.
In short, a softmax with a learnable temperature \alpha is easier to optimize: without the scaling, the logits fed to the softmax are too small to produce confident probability distributions, which weakens the training signal.
You can refer to [1, 2] for more details.
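To make the idea concrete, here is a minimal PyTorch sketch of a cosine-similarity classifier with a learnable temperature, in the style of [2]. This is an illustrative reconstruction, not the repository's actual code: the class name `ScaledCosineClassifier` and the initialization `init_s=10.0` are assumptions based on the question (the repo fixes `self.s` to 10; making it an `nn.Parameter` is what "learnable" means here).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledCosineClassifier(nn.Module):
    """Illustrative sketch of a temperature-scaled cosine classifier.

    Cosine similarities lie in [-1, 1], so feeding them to a softmax
    directly yields nearly uniform probabilities. Multiplying by a
    (learnable) temperature s sharpens the distribution so that
    cross-entropy training can converge properly.
    """

    def __init__(self, feat_dim: int, num_classes: int, init_s: float = 10.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        # Learnable temperature (alpha in the paper), initialized to 10.
        self.s = nn.Parameter(torch.tensor(init_s))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        feats = F.normalize(features, dim=1)   # unit-norm features
        w = F.normalize(self.weight, dim=1)    # unit-norm class weights
        cos = feats @ w.t()                    # cosine similarities in [-1, 1]
        return self.s * cos                    # scaled logits for softmax/CE
```

Because the output is just scaled logits, it plugs directly into `nn.CrossEntropyLoss`; the gradient with respect to `s` lets the network choose how peaked the softmax should be.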
[1] Distilling the Knowledge in a Neural Network.
[2] Low-shot learning with imprinted weights.