
How to do weight normalization at the last layer #16207

@dwy927

Description


❓ Questions and Help


Many loss functions in face recognition normalize the feature or the weight before computing the softmax loss, for example NormFace (https://arxiv.org/abs/1704.06369), L2-Softmax (https://arxiv.org/abs/1703.09507), and so on.
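
For context, a minimal sketch of what such losses compute once both the feature and the weight are L2-normalized; the scale factor s and the variable names here are illustrative, not taken from either paper:

import torch
import torch.nn.functional as F

feat = torch.randn(8, 2)     # batch of 2-D features
weight = torch.randn(10, 2)  # classifier weight, one row per class
s = 16.0                     # hypothetical scale factor

# With both sides normalized, the logits are scaled cosine similarities.
logits = s * F.linear(F.normalize(feat, dim=1), F.normalize(weight, dim=1))
loss = F.cross_entropy(logits, torch.randint(0, 10, (8,)))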

I'd like to know how to normalize the weight of the last classification layer.

self.feature =  torch.nn.Linear(7*7*64, 2) # Feature extract layer
self.pred = torch.nn.Linear(2, 10, bias=False) # Classification layer

I want to replace the weight parameter of the self.pred module with a normalized one.
In other words, I want to replace the weight in place, like this:

self.pred.weight = self.pred.weight / torch.norm(self.pred.weight, dim=1, keepdim=True)

When I try to do this, I get the following error:

TypeError: cannot assign 'torch.FloatTensor' as parameter 'weight' (torch.nn.Parameter or None expected)

I am a newcomer to PyTorch and I don't know the standard way to handle this. Thanks a lot!
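
A common workaround, assuming the normalized weight is only needed when computing the logits, is to leave self.pred.weight as a regular nn.Parameter and apply the normalization functionally in forward with F.normalize and F.linear; this is a sketch, not the only possible approach:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.feature = nn.Linear(7 * 7 * 64, 2)   # Feature extract layer
        self.pred = nn.Linear(2, 10, bias=False)  # Classification layer

    def forward(self, x):
        feat = self.feature(x)
        # Normalize the weight rows on the fly instead of reassigning the
        # Parameter; gradients still flow back to self.pred.weight.
        w = F.normalize(self.pred.weight, dim=1)
        return F.linear(feat, w)

If the replacement is instead meant to happen outside autograd, assigning through self.pred.weight.data (or wrapping the normalized tensor in torch.nn.Parameter before assigning) avoids the TypeError, but such an update is not tracked by the computation graph.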
