FreeMatch SAF loss is a negative value? #182
Comments
A negative loss value doesn't affect the gradient.
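The point above can be checked numerically: shifting a loss by a constant (even one that makes its value negative) leaves the gradient untouched. A toy sketch (not the FreeMatch code):

```python
import numpy as np

def loss(w):
    # simple quadratic loss with minimum at w = 3
    return (w - 3.0) ** 2

def shifted_loss(w):
    # same loss minus a constant, so its value is negative near the minimum
    return loss(w) - 100.0

def num_grad(f, w, eps=1e-6):
    # central-difference numerical gradient
    return (f(w + eps) - f(w - eps)) / (2 * eps)

w = 1.0
g_plain = num_grad(loss, w)
g_shifted = num_grad(shifted_loss, w)
# the two gradients are identical: the constant offset (and hence the
# sign of the loss value) plays no role in the update direction
```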
Thank you for your fast reply!
The loss encourages fairness of the average predictions: we expect the averaged predictions to be close to uniform. In terms of the entropy loss, this corresponds to maximizing the entropy. For stability, we replace the target term with a momentum-smoothed average prediction. That's the intuition behind this loss; the paper and the related works it cites discuss this in more detail.
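That intuition can be sketched numerically. The helper below is a hypothetical reimplementation of the fairness term in eq. (11) (`SumNorm`, the EMA target, and a flat pseudo-label histogram are all assumptions for illustration), showing that the loss is always negative and that minimizing it still drives the average prediction toward uniform:

```python
import numpy as np

def sum_norm(x):
    """Normalize a nonnegative vector so it sums to 1."""
    return x / x.sum()

def saf_loss(p_avg, h_avg, p_ema, h_ema, eps=1e-12):
    """Sketch of the self-adaptive fairness loss (paper's eq. 11):
    L_f = -H(SumNorm(p_ema / h_ema), SumNorm(p_avg / h_avg)),
    where H is cross-entropy; since H >= 0, L_f is always <= 0."""
    target = sum_norm(p_ema / (h_ema + eps))
    pred = sum_norm(p_avg / (h_avg + eps))
    return float(np.sum(target * np.log(pred + eps)))

C = 4
hist = np.full(C, 1.0 / C)  # assume a flat pseudo-label histogram

# When the momentum target tracks the current batch average, L_f reduces
# to the negative entropy of the average prediction:
uniform = np.full(C, 1.0 / C)
peaked = np.array([0.97, 0.01, 0.01, 0.01])

loss_uniform = saf_loss(uniform, hist, uniform, hist)
loss_peaked = saf_loss(peaked, hist, peaked, hist)

# Both values are negative, but the uniform average gives the *lower*
# one, so gradient descent on L_f pushes the average prediction toward
# uniform, i.e. it maximizes its entropy.
```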
Thank you for explaining it!
Bug
In equation 11 of your FreeMatch paper, the SAF loss is a negative value, and this negative value is then added to the total loss. Isn't this causing SumNorm(p̃_t / h̃_t) and SumNorm(p̄ / h̄) to become more dissimilar?
Reproduce the Bug
Error Messages and Logs