Question of supervised train loss #10

Closed
Chris-WangQ opened this issue Jun 24, 2022 · 1 comment

Chris-WangQ commented Jun 24, 2022

Hello,
In the "model.py" , I finded the loss function of supervised training is CrossEntropyLoss. With hard negatives, why not select TripletMarginLoss?
Looking forward to your reply, thanks!

kongds (Owner) commented Jun 24, 2022

Hello,

CrossEntropyLoss here implements the contrastive learning objective (including hard negatives) for supervised training, following SimCSE.
The overall training objective for an anchor $h_i$ is:

$-\log \dfrac{e^{\mathrm{sim}(h_i, h_i^+)/\tau}}{\sum_{j=1}^{N} \left( e^{\mathrm{sim}(h_i, h_j^+)/\tau} + e^{\mathrm{sim}(h_i, h_j^-)/\tau} \right)}$

where $\mathrm{sim}(\cdot,\cdot)$ is cosine similarity, $\tau$ is the temperature, $h_j^+$ are the positives in the batch, and $h_j^-$ are the hard negatives.

And I think contrastive learning with in-batch negatives is more effective than TripletMarginLoss, as shown by SimCSE.
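For anyone wondering how this objective reduces to CrossEntropyLoss, here is a minimal PyTorch sketch; the function name, tensor shapes, and temperature value are illustrative assumptions and are not copied from model.py:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def supervised_contrastive_loss(h, h_pos, h_neg, temperature=0.05):
    """SimCSE-style supervised contrastive loss expressed via CrossEntropyLoss.

    h, h_pos, h_neg: (batch, dim) embeddings of the anchors, their positives,
    and their hard negatives. For anchor i the "correct class" is its own
    positive; every other positive and every hard negative in the batch
    serves as a negative.
    """
    # (batch, batch) cosine similarities: each anchor vs. every positive
    sim_pos = F.cosine_similarity(h.unsqueeze(1), h_pos.unsqueeze(0), dim=-1)
    # (batch, batch) cosine similarities: each anchor vs. every hard negative
    sim_neg = F.cosine_similarity(h.unsqueeze(1), h_neg.unsqueeze(0), dim=-1)
    # Row i has 2*batch logits; column i corresponds to the positive pair
    logits = torch.cat([sim_pos, sim_neg], dim=1) / temperature
    labels = torch.arange(h.size(0), device=h.device)
    return nn.CrossEntropyLoss()(logits, labels)
```

With this framing, each anchor is just a (2 × batch)-way classification problem whose target is its own positive, which is exactly what CrossEntropyLoss computes.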

kongds closed this as completed Aug 21, 2022