Hello,
In "model.py", I found that the loss function for supervised training is CrossEntropyLoss. Since hard negatives are used, why not choose TripletMarginLoss?
Looking forward to your reply, thanks!
CrossEntropyLoss here implements the contrastive learning objective (including the hard negatives) for supervised training, following SimCSE.
The overall training objective is (the SimCSE supervised objective with hard negatives, for anchor $h_i$ with temperature $\tau$):

$$\ell_i = -\log \frac{e^{\mathrm{sim}(h_i, h_i^{+})/\tau}}{\sum_{j=1}^{N}\left(e^{\mathrm{sim}(h_i, h_j^{+})/\tau} + e^{\mathrm{sim}(h_i, h_j^{-})/\tau}\right)}$$

And I believe this contrastive objective is more effective than TripletMarginLoss, as shown in the SimCSE paper.
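To make the connection concrete, here is a minimal PyTorch sketch of how an InfoNCE-style contrastive loss with hard negatives reduces to CrossEntropyLoss: each anchor's positive sits on the diagonal of the similarity matrix, so the "class label" for row $i$ is simply $i$. The tensor names and the batch size / dimension / temperature values below are illustrative assumptions, not the repo's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical embeddings (replace with encoder outputs in practice).
batch, dim, temp = 4, 8, 0.05
z = F.normalize(torch.randn(batch, dim), dim=-1)      # anchors h_i
z_pos = F.normalize(torch.randn(batch, dim), dim=-1)  # positives h_j^+
z_neg = F.normalize(torch.randn(batch, dim), dim=-1)  # hard negatives h_j^-

# Cosine similarities of every anchor to every positive and hard negative.
sim_pos = z @ z_pos.T / temp                 # (batch, batch)
sim_neg = z @ z_neg.T / temp                 # (batch, batch)
logits = torch.cat([sim_pos, sim_neg], dim=1)  # (batch, 2 * batch)

# For anchor i, the "correct class" is its own positive in column i,
# so CrossEntropyLoss over these logits is exactly the InfoNCE objective.
labels = torch.arange(batch)
loss = nn.CrossEntropyLoss()(logits, labels)
print(loss.item())
```

Note that all in-batch positives of other examples act as additional negatives for anchor $i$, which is why this formulation contrasts against many samples at once, unlike TripletMarginLoss, which compares only one positive and one negative per anchor.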