dalisson/am_softmax

This is a PyTorch implementation of AM-Softmax. The softmax layer includes the class-assignment fully connected layer, because its weights must be L2-normalized.


PyTorch implementation of additive margin softmax (AM-Softmax), as described in the original paper: https://arxiv.org/pdf/1801.05599.pdf

This is a loss function designed for embedding learning: it encourages intra-class distances to be small and inter-class distances to be large. The input to the layer should be a batch x embedding tensor, e.g. an input of size 64x512 means 64 embeddings of size 512 each. The output is softmax logits, which can then be passed to an NLL loss implementation.
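
For reference, below is a minimal sketch of how such a layer can be written in PyTorch. The class name AMSoftmaxLayer, its constructor arguments, and the default scale s=30.0 and margin m=0.35 are illustrative assumptions, not this repository's actual API.

```python
# Minimal sketch of an additive margin softmax layer (not this repo's exact API).
# Assumed hyperparameters: scale s and margin m, as in the AM-Softmax paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AMSoftmaxLayer(nn.Module):
    def __init__(self, embedding_size, n_classes, s=30.0, m=0.35):
        super().__init__()
        self.s = s  # scale applied to the cosine logits
        self.m = m  # additive margin subtracted from the target-class cosine
        # Class-assignment weights: one column per class, normalized in forward().
        self.weight = nn.Parameter(torch.randn(embedding_size, n_classes))

    def forward(self, embeddings, labels):
        # L2-normalize both the embeddings and the class weights so that
        # their dot product is the cosine similarity.
        x = F.normalize(embeddings, dim=1)    # (batch, embedding_size)
        w = F.normalize(self.weight, dim=0)   # (embedding_size, n_classes)
        cosine = x @ w                        # (batch, n_classes)
        # Subtract the margin m from the target-class cosine only.
        one_hot = F.one_hot(labels, cosine.size(1)).to(cosine.dtype)
        logits = self.s * (cosine - self.m * one_hot)
        # Log-probabilities, ready for an NLL loss.
        return F.log_softmax(logits, dim=1)

# Example: 64 embeddings of size 512, classified into 10 classes.
layer = AMSoftmaxLayer(embedding_size=512, n_classes=10)
embeddings = torch.randn(64, 512)
labels = torch.randint(0, 10, (64,))
loss = F.nll_loss(layer(embeddings, labels), labels)
```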
