Interpreting fine-tuned BERT model using LIME #409
The prediction function in LIME takes a list of raw strings and must return class probabilities as a NumPy array, one row per input.
Thanks for your reply. I figured it out. If anyone is interested in interpreting BERT using LIME, here is the correct example :)
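A minimal sketch of the idea (assuming a model fine-tuned with `BertForSequenceClassification` from a recent `transformers` 4.x release; the model path and class names are placeholders):

```python
import torch
from lime.lime_text import LimeTextExplainer
from transformers import BertForSequenceClassification, BertTokenizer

MODEL_PATH = "path/to/fine-tuned-bert"  # placeholder for your checkpoint

tokenizer = BertTokenizer.from_pretrained(MODEL_PATH)
model = BertForSequenceClassification.from_pretrained(MODEL_PATH)
model.eval()

def predict_proba(texts):
    """LIME calls this with a list of perturbed strings and expects an
    (n_samples, n_classes) NumPy array of probabilities back."""
    encodings = tokenizer(texts, padding=True, truncation=True,
                          max_length=128, return_tensors="pt")
    with torch.no_grad():
        logits = model(**encodings).logits
    return torch.softmax(logits, dim=-1).numpy()

explainer = LimeTextExplainer(class_names=["negative", "positive"])
explanation = explainer.explain_instance(
    "The movie was great!", predict_proba, num_features=6, num_samples=500)
print(explanation.as_list())
```

The key point is that the tokenization happens inside `predict_proba`, so LIME only ever sees and perturbs raw text.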
Hello Ramesh, thank you for your answer. I have a question: what is the processor doing in the `__init__` method?
Thank you, J.
Hi @jusugac1, I was using that to process my dataset, but it is not needed here, so I have edited it out now :)
Hi Ramesh, I wonder how you handle an example with two sentences for the MRPC dataset. It has two inputs, so the input for BERT should be a sentence pair of the form `[CLS] sentence1 [SEP] sentence2 [SEP]`; see the sketch below for one way I imagine this could work. Thank you very much!
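One possible approach (an assumption on my part, reusing the `tokenizer`, `model`, and `explainer` from the example above) is to keep the second sentence fixed and close over it, so LIME only perturbs the first sentence:

```python
def make_pair_predictor(sentence2):
    # Close over the fixed second sentence; LIME perturbs only the first.
    def predict_proba(texts):
        encodings = tokenizer(texts, [sentence2] * len(texts),
                              padding=True, truncation=True,
                              return_tensors="pt")
        with torch.no_grad():
            logits = model(**encodings).logits
        return torch.softmax(logits, dim=-1).numpy()
    return predict_proba

sent1 = "Amrozi accused his brother of distorting his evidence."  # placeholder
sent2 = "Referring to him as only the witness, Amrozi accused his brother."  # placeholder
explanation = explainer.explain_instance(sent1, make_pair_predictor(sent2),
                                         num_features=6)
```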
I'm interested too. Have you solved it?
what is "MODEL_CLASSES" in your code? |
How about applying LIME to a deep model? Is it possible?
Hello, have you solved this problem? I encountered the same one.
Hello @rameshjes! I ran your code and the following error occurred at `config_class, model_class, tokenizer_class = MODEL_CLASSES[bert_model_class]`. How can I solve this problem? Thank you very much.
Hi, did you manage to find a solution for this?
@Elizabithi1-dev @Tingsie You have probably solved it by now, but for BERT, `MODEL_CLASSES` resolves to `(BertConfig, BertForSequenceClassification, BertTokenizer)`.
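For reference, a plausible definition following the `MODEL_CLASSES` pattern used in the `transformers` example scripts (an assumption; the exact dict from the notebook is not shown here):

```python
from transformers import BertConfig, BertForSequenceClassification, BertTokenizer

# Maps a model-type key to its (config, model, tokenizer) classes,
# following the pattern in the transformers run_glue.py example.
MODEL_CLASSES = {
    "bert": (BertConfig, BertForSequenceClassification, BertTokenizer),
}

bert_model_class = "bert"
config_class, model_class, tokenizer_class = MODEL_CLASSES[bert_model_class]
```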
Thanks for this amazing work. I am trying to interpret a fine-tuned BERT model using the Transformers framework. There seems to be a tokenization issue when I try to use LIME with BERT. Here is the error that I am getting:
Here is my code:
I have checked issue #356, but I still cannot figure out my problem.
Any leads will be appreciated.
Thank you :)