RBF-Softmax is a simple but effective image classification loss function for deep neural networks. This RBF-Softmax project is written in PyTorch and adapted from pycls.
In RBF-Softmax, logits are computed with an RBF kernel and then scaled by a hyperparameter, so the weights of the last fully connected layer are treated as class prototypes.
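A minimal sketch of this idea in PyTorch, assuming squared Euclidean distances to the prototypes and illustrative hyperparameter names (`gamma`, `scale`) that may differ from the ones used in this repository:

```python
import torch
import torch.nn as nn


class RBFSoftmaxLogits(nn.Module):
    """Sketch of an RBF-kernel classification head.

    The last-FC weights act as class prototypes; logits come from an
    RBF kernel applied to feature-prototype distances, then a scale
    hyperparameter. Names and defaults here are illustrative, not
    necessarily those of the official implementation.
    """

    def __init__(self, feat_dim, num_classes, gamma=8.0, scale=16.0):
        super().__init__()
        # One learnable prototype per class, same role as last-FC weights.
        self.prototypes = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.gamma = gamma
        self.scale = scale

    def forward(self, feats):
        # Squared Euclidean distances, shape (batch, num_classes).
        dist = torch.cdist(feats, self.prototypes).pow(2)
        # RBF kernel maps distances to similarities in (0, 1];
        # the scale factor sharpens the subsequent softmax.
        logits = self.scale * torch.exp(-dist / self.gamma)
        return logits  # feed into nn.CrossEntropyLoss as usual
```

The returned logits can be passed directly to `nn.CrossEntropyLoss`, so the head is a drop-in replacement for an ordinary linear classifier.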
The following GIF is a toy demo visualizing the 2D features of RBF-Softmax and other losses trained on MNIST. As training proceeds, the intra-class distances become smaller and smaller.
- Since RBF-Softmax uses pycls as its codebase, most of the installation follows pycls. Please refer to INSTALL.md for installation instructions.
- After installation, please see GETTING_STARTED.md for basic instructions and example commands for training and evaluation with RBF-Softmax.
We provide some final results and pretrained models for download in the Model Zoo. Note that in the paper we report the best top-1 performance achieved over the whole training run.
If you find RBF-Softmax helpful in your research, please consider citing:
@InProceedings{xzhang2020rbf,
title = {RBF-Softmax: Learning Deep Representative Prototypes with Radial Basis Function Softmax},
author = {Zhang, Xiao and Zhao, Rui and Qiao, Yu and Li, Hongsheng},
booktitle = {ECCV},
year = {2020}
}
RBF-Softmax is licensed under the MIT license. Please see the LICENSE file for more information.