Hi, thank you for your interest! The code differs slightly from the equations in our paper. In practice, when the dataset contains many classes, the difference is negligible. In our code, the effective weight for the GT label is confidence + smoothing / num_classes (where confidence = 1 - smoothing), because the smooth term averages over all K classes, including the GT one.
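The effective GT weight can be seen by expanding the loss. Below is a minimal sketch in plain Python (no framework dependency) mirroring the structure of the loss in the linked `methods.py`; the function name and inputs are illustrative, not the repository's actual API:

```python
import math

def smoothed_loss(log_probs, target, smoothing=0.1):
    """Label-smoothing loss where the smooth term averages over ALL classes.

    log_probs: list of per-class log-probabilities for one sample.
    target:    index of the ground-truth class.
    """
    confidence = 1.0 - smoothing
    num_classes = len(log_probs)
    nll_loss = -log_probs[target]                    # GT term only
    smooth_loss = -sum(log_probs) / num_classes      # mean over all K classes, GT included
    return confidence * nll_loss + smoothing * smooth_loss

# Expanding the sum shows the GT log-prob is weighted by
#   confidence + smoothing / num_classes,
# while each non-GT class gets smoothing / num_classes.
```

So even though `smooth_loss` includes the GT label, the result is equivalent to a per-class weighting of confidence + smoothing/K on the GT class and smoothing/K on the others.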
Hello:
In the paper, I think nll_loss applies only to the GT label, and smooth_loss applies to the remaining K-1 labels.
But in the code
https://github.com/Jia-Research-Lab/MiSLAS/blob/e8f91e59a910c5543ea1bcabb955ba368c606a00/methods.py#L62
the smooth_loss still seems to include the GT label.
I am confused about this.