Get label output instead of Sentence Embedding #1721

Open
zaowad opened this issue Oct 11, 2022 · 1 comment

Comments


zaowad commented Oct 11, 2022

If I use SoftmaxLoss as in the NLI example (https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/nli/training_nli.py), then during training the base Transformer+pooling model is passed to SoftmaxLoss (https://github.com/UKPLab/sentence-transformers/blob/master/sentence_transformers/losses/SoftmaxLoss.py), which adds an nn.Linear layer with softmax activation on top of the base model. My evaluator is BinaryClassificationEvaluator. During evaluation, instead of producing label outputs, the evaluator computes embeddings for the source-target sentence pairs and reports the best threshold for various similarity measures (cosine distance, dot product, etc.). Inference with that threshold works fine in my case. My question is: is there any way to get label outputs from a SentenceTransformer model saved as a checkpoint during training?
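For reference, the threshold-based inference described above can be sketched as follows (a minimal NumPy sketch; `threshold` stands in for the value reported by BinaryClassificationEvaluator, and the toy vectors are placeholders for real sentence embeddings):

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two sentence embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def predict_pair(u: np.ndarray, v: np.ndarray, threshold: float) -> int:
    # Mirror the evaluator-style decision rule: label 1 when the
    # similarity exceeds the tuned threshold, else label 0.
    return 1 if cosine_similarity(u, v) > threshold else 0

# Toy embeddings: identical vectors score 1.0, orthogonal ones 0.0.
u = np.array([1.0, 0.0, 0.0])
print(predict_pair(u, u, threshold=0.5))                          # 1
print(predict_pair(u, np.array([0.0, 1.0, 0.0]), threshold=0.5))  # 0
```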

@yuchengyue

I am also trying to use SoftmaxLoss to build a binary classification model. I think you can load the saved model and then get the label probabilities through the classifier output inside SoftmaxLoss.
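A minimal sketch of that idea, assuming the default SoftmaxLoss configuration in which the classifier sees the concatenation of u, v, and |u − v| (so its input size is 3 × embedding dim). Note the `W` and `b` below are random placeholders: `SentenceTransformer.save()` does not store the SoftmaxLoss classifier, so its trained weights would have to be saved separately (e.g. the state dict of `SoftmaxLoss.classifier`) during training and loaded here instead:

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over a logits vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def classify_pair(u: np.ndarray, v: np.ndarray,
                  W: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Label probabilities for a sentence pair.

    u, v: sentence embeddings of dimension d
    W: classifier weights of shape (num_labels, 3 * d)
    b: classifier bias of shape (num_labels,)
    """
    # Same feature construction as SoftmaxLoss's default settings:
    # concatenate u, v, and the elementwise absolute difference.
    features = np.concatenate([u, v, np.abs(u - v)])
    return softmax(W @ features + b)

# Toy demo with embedding dim 4 and 2 labels (random stand-in weights).
rng = np.random.default_rng(0)
u, v = rng.normal(size=4), rng.normal(size=4)
W, b = rng.normal(size=(2, 12)), np.zeros(2)
probs = classify_pair(u, v, W, b)
print(probs)  # two probabilities summing to 1
```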
