Adds reranker example #58
Conversation
Can we give a slightly more meaningful example to show that it indeed works? And should we move this to docs/?
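For reference, a minimal sketch of what a more meaningful example could look like, built around the `rerank(query, texts)` interface referenced later in this thread. The `MonoT5` class name, default checkpoint, and sample passages are assumptions drawn from later pygaggle documentation, not from this PR, so treat this as a sketch rather than the code the PR adds:

```python
from pygaggle.rerank.base import Query, Text
from pygaggle.rerank.transformer import MonoT5

# Assumed constructor: MonoT5() is expected to load a default T5
# reranking checkpoint; swap in whatever class the repo actually exposes.
reranker = MonoT5()

query = Query('who proposed the geocentric theory')

# Illustrative passages; in a real example these would come from a
# first-stage retriever such as BM25.
passages = [
    ('d1', 'Copernicus proposed the heliocentric model of the solar system.'),
    ('d2', 'For Earth-centered astronomy it was Ptolemy who proposed the geocentric model.'),
    ('d3', 'The geocentric theory places the Earth at the center of the universe.'),
]
texts = [Text(body, {'docid': docid}, 0) for docid, body in passages]

# rerank() attaches a model score to each text; printing them sorted by
# score makes it easy to eyeball that the reranker actually does something.
reranked = reranker.rerank(query, texts)
for rank, text in enumerate(sorted(reranked, key=lambda t: t.score, reverse=True), start=1):
    print(f'{rank} {text.score:.4f} {text.metadata["docid"]}')
```

Seeing the on-topic passage rise above the distractors is what makes the example "meaningful" in the sense asked for above.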
Regarding exa…
Regarding a meaningful example, sure!
LGTM!
I am trying to use monoBERT instead of T5. Here is the code I am running: import torch …
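For context, one generic way to score a query-passage pair with a BERT-style cross-encoder directly through Hugging Face transformers looks roughly like the sketch below. The checkpoint name and the choice of "relevant" logit are assumptions, and this is not necessarily the setup used in the comment above:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical checkpoint; any BERT-style relevance classifier follows the
# same pattern. If the repo ships without tokenizer files, a plain
# 'bert-large-uncased' tokenizer is the usual fallback.
model_name = 'castorini/monobert-large-msmarco'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).eval()

query = 'who proposed the geocentric theory'
passage = 'For Earth-centered astronomy it was Ptolemy who proposed the geocentric model.'

# Encode the pair jointly; the model sees "[CLS] query [SEP] passage [SEP]".
inputs = tokenizer(query, passage, return_tensors='pt', truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # recent transformers versions return a ModelOutput

# For a two-class relevance head, the probability of the "relevant" class
# (assumed here to be index 1) can serve as the reranking score.
score = torch.softmax(logits, dim=-1)[0, 1].item()
print(score)
```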
Hi Fatima, could you try …
I tried that and I am getting the same error, shown below:

/content/gdrive/My Drive/Reranking-pygaggle/pygaggle/pygaggle/rerank/transformer.py in rerank(self, query, texts)
ValueError: too many values to unpack (expected 1)

Not sure if the tokenizer is the issue.
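As a generic note, "too many values to unpack (expected 1)" is what Python raises when an object with several fields is unpacked into a single variable, which is what a tokenizer returning multiple tensors can trigger. The hypothetical snippet below reproduces the error pattern; whether this is what happens inside rerank() here is only a guess:

```python
# Hypothetical reproduction of the error pattern, not pygaggle code.
batch = {'input_ids': [[101, 2054, 102]], 'attention_mask': [[1, 1, 1]]}

# Unpacking a dict iterates over its keys, so expecting exactly one
# field fails as soon as the tokenizer returns more than one:
(input_ids,) = batch  # ValueError: too many values to unpack (expected 1)
```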
The code below worked for me: …
Great, thanks, Fatima! I've created a pull request that exemplifies how to use the BERT reranker: #59 |
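For readers landing here later, the BERT reranker is typically swapped in by changing only the reranker class. The sketch below follows the current pygaggle documentation; the `MonoBERT` class name and its default checkpoint are assumptions about what the linked pull request demonstrates:

```python
from pygaggle.rerank.base import Query, Text
from pygaggle.rerank.transformer import MonoBERT

# MonoBERT() is assumed to load its default BERT reranking checkpoint;
# the rest of the pipeline is identical to the T5 sketch above.
reranker = MonoBERT()

query = Query('who proposed the geocentric theory')
texts = [Text('Ptolemy proposed the geocentric model.', {'docid': 'd2'}, 0)]

for text in reranker.rerank(query, texts):
    print(f'{text.score:.4f} {text.metadata["docid"]}')
```

Because both wrappers expose the same rerank(query, texts) interface mentioned in the traceback above, switching between the T5 and BERT rerankers should only change the constructor.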
Thanks a lot for the code. |