❓ Questions and Help
Please note that this issue tracker is not a help form and this issue will be closed.
We have a set of listed resources available on the website. Our primary means of support is our discussion forum:
- Discussion Forum
https://discuss.pytorch.org/t/how-to-do-weight-normalization-in-classification-layer/35193?u=wenying_dai
Many loss functions in face recognition normalize the feature or the weight before computing the softmax loss, for example NormFace (https://arxiv.org/abs/1704.06369) and L2-softmax (https://arxiv.org/abs/1703.09507).
I'd like to know how to normalize the weight in the last classification layer.
self.feature = torch.nn.Linear(7*7*64, 2)  # Feature extraction layer
self.pred = torch.nn.Linear(2, 10, bias=False)  # Classification layer
I want to replace the weight parameter of the self.pred module with a normalized one.
In other words, I want to replace the weight in place, like this:
self.pred.weight = self.pred.weight / torch.norm(self.pred.weight, dim=1, keepdim=True)
When I try to do this, I get the following error:
TypeError: cannot assign 'torch.FloatTensor' as parameter 'weight' (torch.nn.Parameter or None expected)
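For context, nn.Module only accepts torch.nn.Parameter (or None) when assigning to a registered parameter attribute, and the division on the right-hand side produces a plain tensor, hence the TypeError. A minimal sketch of one workaround, assuming the goal is just to overwrite the stored values (note the normalization step itself is then not tracked by autograd):

import torch

pred = torch.nn.Linear(2, 10, bias=False)

# Copy the normalized values into the existing Parameter object instead of
# replacing it; no_grad() is needed because Parameters require grad by default.
with torch.no_grad():
    pred.weight.copy_(pred.weight / torch.norm(pred.weight, dim=1, keepdim=True))

Re-wrapping the result in torch.nn.Parameter would also silence the error, but it creates a new Parameter object that an already-constructed optimizer would not be tracking.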
I am a newcomer to PyTorch and don't know the standard way to handle this. Thanks a lot!
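For reference, a minimal sketch of the normalize-in-forward approach that NormFace-style losses typically use: keep self.pred.weight as a regular parameter and normalize it on the fly, so gradients flow through the normalization and the stored parameter is never mutated. The module name NormalizedClassifier is hypothetical; the layer sizes match the snippet above.

import torch
import torch.nn.functional as F

class NormalizedClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.feature = torch.nn.Linear(7 * 7 * 64, 2)   # feature extraction layer
        self.pred = torch.nn.Linear(2, 10, bias=False)  # classification layer

    def forward(self, x):
        feat = self.feature(x)
        # L2-normalize each class weight row at every forward pass; the
        # normalization is part of the graph, so it is differentiable.
        w = F.normalize(self.pred.weight, p=2, dim=1)
        # Normalizing feat as well (as NormFace does) would give pure
        # cosine-similarity logits.
        return F.linear(feat, w)

x = torch.randn(4, 7 * 7 * 64)
logits = NormalizedClassifier()(x)
print(logits.shape)  # torch.Size([4, 10])

PyTorch also ships torch.nn.utils.weight_norm, but note that it reparameterizes the weight as magnitude times direction rather than constraining it to unit norm.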