
Is SigmoidContrastLoss better with a small batch size and SoftmaxContrastLoss better with a large one? #64

Open

NLPJCL opened this issue Jul 23, 2023 · 2 comments

Labels: bug (Something isn't working)

NLPJCL commented Jul 23, 2023

🐛 Bug description

SigmoidContrastLoss is said to work a bit better when the batch size is small, and SoftmaxContrastLoss a bit better when it is large. Is this conclusion based on experiments, or is there a theoretical analysis behind it? In theory, with the first loss each query, given its one positive, solves batch_size - 1 binary classification tasks against the batch_size - 1 in-batch negatives.
With SoftmaxContrastLoss, each query, given its one positive, solves a single multi-class classification task (also seeing the same batch_size - 1 negatives). From the point of view of the negatives, there seems to be no difference, right?
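
For concreteness, here is a minimal PyTorch sketch of the two formulations as I understand them (in-batch negatives, cosine similarity divided by a temperature); the function names and the temperature value are illustrative, not the repository's actual implementation:

```python
import torch
import torch.nn.functional as F

def softmax_contrast_loss(query_embs: torch.Tensor, pos_embs: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    # (batch_size, batch_size) similarity matrix: row i compares query i with every positive in the batch.
    sims = F.cosine_similarity(query_embs.unsqueeze(1), pos_embs.unsqueeze(0), dim=-1) / temperature
    # One multi-class task per query: the matching positive (diagonal entry) is the correct class,
    # the other batch_size - 1 columns act as negatives.
    labels = torch.arange(sims.size(0), device=sims.device)
    return F.cross_entropy(sims, labels)

def sigmoid_contrast_loss(query_embs: torch.Tensor, pos_embs: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    sims = F.cosine_similarity(query_embs.unsqueeze(1), pos_embs.unsqueeze(0), dim=-1) / temperature
    # One binary label per (query, document) pair: the diagonal pair is positive,
    # the batch_size - 1 off-diagonal pairs are negatives, treated as independent binary tasks.
    labels = torch.eye(sims.size(0), device=sims.device)
    return F.binary_cross_entropy_with_logits(sims, labels)
```

Both variants see exactly the same batch_size - 1 in-batch negatives; they differ only in whether the scores of a row are normalized jointly (softmax) or treated as independent logits (sigmoid).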

Python Version

None

NLPJCL added the bug (Something isn't working) label on Jul 23, 2023
wangyuxinwhy (Owner) commented

This is just a rough empirical conclusion. Your understanding is correct: there is no major difference between these two losses, just the distinction between multiple binary classifications and a single multi-class classification.

NLPJCL (Author) commented Jul 24, 2023

Thanks 🙏
