Histogram loss #651
Conversation
Is there a good default value we can use for `n_bins`?
From the Histogram loss paper (quoted excerpts not captured in this thread): I think that 100 can be a good default since, in any case, fine-tuning this parameter is not significant.
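For context, the loss being discussed builds soft histograms of pairwise similarities, so `n_bins` controls how finely the similarity range [-1, 1] is discretized. A minimal NumPy sketch of the idea, paraphrasing the paper (function names and the triangular-kernel details here are my own illustration, not the library's actual implementation):

```python
import numpy as np

def soft_histogram(sims, n_bins=100):
    # Nodes span [-1, 1]; delta is the node spacing implied by n_bins.
    delta = 2.0 / (n_bins - 1)
    nodes = np.linspace(-1.0, 1.0, n_bins)
    # Triangular (linear-interpolation) weights assign each similarity
    # to its two nearest nodes, as in the Histogram loss paper.
    w = np.maximum(0.0, 1.0 - np.abs(sims[None, :] - nodes[:, None]) / delta)
    h = w.sum(axis=1)
    return h / h.sum()

def histogram_loss(pos_sims, neg_sims, n_bins=100):
    h_pos = soft_histogram(pos_sims, n_bins)
    h_neg = soft_histogram(neg_sims, n_bins)
    # Estimated probability that a negative pair is more similar
    # than a random positive pair.
    return float(np.sum(h_neg * np.cumsum(h_pos)))
```

With well-separated positives (similarities near 1) and negatives (near -1), the loss is close to 0; if the two distributions are swapped, it is close to 1.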
Ok, I set the default to 100. Can you add a description for it? Also, does this loss work with anything other than CosineSimilarity?
Need to adjust the code according to the documentation.
If n_bins now defaults to 100, the initial check that at least one parameter must be set becomes useless. I propose leaving both n_bins and delta undefined by default; if the user specifies neither of them, the default value of 100 for n_bins is used. I will commit this change as soon as possible.
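The proposed default handling could look roughly like this (a hypothetical helper sketch; the name `resolve_bins` and the relation delta = 2 / (n_bins - 1), which follows from similarities being bounded in [-1, 1], are my assumptions, not the PR's actual code):

```python
def resolve_bins(n_bins=None, delta=None):
    # If the user specifies neither parameter, fall back to n_bins = 100.
    if n_bins is None and delta is None:
        n_bins = 100
    if delta is None:
        # Node spacing over the similarity range [-1, 1].
        delta = 2.0 / (n_bins - 1)
    elif n_bins is None:
        # Invert the relation when only delta is given.
        n_bins = round(2.0 / delta) + 1
    return n_bins, delta
```

This removes the need for a "at least one parameter must be set" check while keeping both parameters optional.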
I think it can work with any distance, provided that the embeddings are normalized. The strict requirement is that the distance values are bounded in [-1, 1], so any distance satisfying this requirement should be acceptable.
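To illustrate the boundedness requirement: with L2-normalized embeddings, pairwise dot products are cosine similarities and therefore always fall in [-1, 1] (a minimal NumPy sketch, not the library's code):

```python
import numpy as np

def normalize(x, eps=1e-12):
    # L2-normalize rows so that pairwise dot products are cosine
    # similarities, which are guaranteed to lie in [-1, 1].
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / np.maximum(norms, eps)

emb = normalize(np.random.randn(8, 16))
sims = emb @ emb.T
assert sims.min() >= -1 - 1e-6 and sims.max() <= 1 + 1e-6
```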
Thanks @domenicoMuscill0!
I implemented the Histogram Loss, and as a comparison I used the rather complex code of this PyTorch implementation of the loss function. It is the officially recognized PyTorch implementation from the paper's authors themselves, but during testing I corrected a bug in this code and documented it. There is another implementation of the loss function, but it is not officially recognized. Should I add it to the tests anyway?