
[feat] Add 'precision' support to the EmbeddingSimilarityEvaluator #2559

Merged
merged 1 commit on Mar 26, 2024

Conversation

tomaarsen
Collaborator

Hello!

Pull Request overview

  • Add 'precision' support to the EmbeddingSimilarityEvaluator

Details

This PR adds support for evaluating quantized embeddings by passing a precision to the evaluator instance.

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator, SimilarityFunction
import datasets

model = SentenceTransformer("all-mpnet-base-v2")

stsb = datasets.load_dataset("mteb/stsbenchmark-sts", split="test")

for precision in ["float32", "uint8", "int8", "ubinary", "binary"]:
    evaluator = EmbeddingSimilarityEvaluator(
        stsb["sentence1"],
        stsb["sentence2"],
        [score / 5 for score in stsb["score"]],
        main_similarity=SimilarityFunction.COSINE,
        name="sts-test",
        precision=precision,
    )
    print(precision, evaluator(model))
Output:

float32 0.8342190421330611
uint8 0.8260094846238505
int8 0.8312754408857808
ubinary 0.8244338431442343
binary 0.8244338431442343
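For intuition on what these precision options do to the embeddings, here is a rough, hypothetical NumPy sketch (not the library's exact implementation): binary/ubinary thresholds each dimension at zero and packs 8 dimensions into one byte, while int8 linearly rescales each dimension's observed range into [-128, 127].

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.standard_normal((4, 16)).astype(np.float32)  # toy float32 embeddings

# "ubinary"-style: threshold at 0, then pack 8 booleans per byte
ubinary = np.packbits(emb > 0, axis=-1)  # shape (4, 2), dtype uint8

# "int8"-style: rescale each dimension's min..max range into -128..127
lo, hi = emb.min(axis=0), emb.max(axis=0)
int8 = ((emb - lo) / (hi - lo) * 255 - 128).astype(np.int8)

print(ubinary.shape, ubinary.dtype)  # (4, 2) uint8 — 16x smaller than float32
print(int8.shape, int8.dtype)        # (4, 16) int8 — 4x smaller than float32
```

The evaluator then computes similarities over these compressed representations, which is why the scores above degrade only slightly relative to float32.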
- Tom Aarsen

@tomaarsen tomaarsen merged commit 1e35d8c into UKPLab:master Mar 26, 2024
9 checks passed
@tomaarsen tomaarsen deleted the quantization/evaluation branch March 26, 2024 07:10