Is there any tutorial on how to apply Lime to BERT? #356
Thank you very much for your reply. It gave me insight into how to write an appropriate function to wrap BERT's predictor. For those interested in the topic, I'll share my function below:
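The function itself did not survive the page capture, so as a rough sketch (with the model faked by random logits so the example is self-contained): a LIME-compatible predictor only needs to map a list of strings to an `(n_samples, n_classes)` probability array. The `predict_probab` name and the fake logits are illustrative assumptions, not the original poster's code.

```python
import numpy as np

def softmax(logits):
    """Convert raw logits to probabilities, row-wise."""
    z = logits - logits.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def predict_probab(texts):
    """LIME-compatible predictor: list of strings -> (n, 2) probability array.

    In a real setup, replace the fake logits with model output, e.g. with
    Hugging Face transformers:
        inputs = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
        logits = model(**inputs).logits.detach().numpy()
    or with simpletransformers:
        _, raw_outputs = model.predict(texts)   # then softmax(raw_outputs)
    """
    rng = np.random.default_rng(0)
    fake_logits = rng.normal(size=(len(texts), 2))  # stand-in for BERT output
    return softmax(fake_logits)

probs = predict_probab(["a short sentence", "another one"])
print(probs.shape)  # (2, 2); each row sums to 1
```

With a function of this shape, `LimeTextExplainer().explain_instance(text, predict_probab, ...)` can perturb the input text and fit its local surrogate model on the returned probabilities.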
Could you please give us more insight into your variables?
Thank you @marcotcr and @ThiagoSousa! Your comments were super helpful in creating a function to use LIME with a fine-tuned RoBERTa model from the simpletransformers library. For those interested, see below a simple example:
Hi! I am trying to use the BERT tokenizer, but I am getting an error: `ValueError: Tokenization produced tokens that do not belong in string!`
For my input string, `tokenizer.tokenize` returns:
I am getting the error below when I run the code from #356 (comment) above.
Not sure what is wrong.
Greetings, I am looking to apply a LIME explainer to a fine-tuned BERT model with a linear output layer. My training pipeline is vanilla; I am just stuck on integrating my model into the LIME explainer. The training data is a list of sentences, each mapping onto a numeric value. The idea is to use LIME to explain a BERT regression model, and the approaches mentioned above have not worked for me. If someone has solved this problem before or has an idea how, I'd be thankful if you let me know.
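LIME's text explainer is written around classifier-style outputs, so one workaround people use for regression (hedged: this is not an officially documented API, and `to_lime_fn` is a name made up here) is to wrap the regressor so its 1-D predictions come back as an `(n, 1)` array and then explain column 0:

```python
import numpy as np

def to_lime_fn(regressor_fn):
    """Wrap a text regressor (list[str] -> 1-D array) so its output is 2-D,
    matching the shape LIME's classifier_fn interface expects."""
    def wrapped(texts):
        preds = np.asarray(regressor_fn(texts), dtype=float)
        return preds.reshape(-1, 1)  # one "label" column holding the regression value
    return wrapped

# Stand-in regressor: score = sentence length (a real one would call the BERT model).
fake_regressor = lambda texts: np.array([len(t) for t in texts])

lime_fn = to_lime_fn(fake_regressor)
print(lime_fn(["short", "a longer sentence"]).shape)  # (2, 1)
```

With this wrapper, `explain_instance(text, lime_fn, labels=(0,))` fits the local surrogate against that single column; whether the resulting explanations are sensible for a given model is worth sanity-checking.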
```
exp = explainer.explain_instance(STR, predict_probab, num_features=10, num_samples=1000)
```
If anyone who wants to perform multi-class classification ends up on this thread, the code attached below should save you some time.
Source: the LIME documentation
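The attached snippet did not survive the page capture, so here is a hedged sketch of the multi-class contract: the predictor returns one probability column per class, and `class_names` / `top_labels` (real `LimeTextExplainer` parameters) control how the explanation is reported. The three-class "model" below is faked with keyword counts so the example runs standalone; the class names and keywords are illustrative only.

```python
import numpy as np

CLASS_NAMES = ["negative", "neutral", "positive"]  # illustrative labels

def predict_multiclass(texts):
    """List of strings -> (n, 3) probability matrix, one column per class."""
    keywords = [["bad", "awful"], ["okay", "fine"], ["good", "great"]]
    # Score each class by counting its keywords, then softmax across classes.
    scores = np.array([[sum(t.count(k) for k in ks) for ks in keywords]
                       for t in texts], dtype=float)
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

probs = predict_multiclass(["good good great", "bad day", "it is fine"])
print(probs.argmax(axis=1))  # -> [2 0 1]
```

Plugging this into LIME would then look like `explainer = LimeTextExplainer(class_names=CLASS_NAMES)` followed by `explainer.explain_instance(text, predict_multiclass, top_labels=2)`, which explains the two most probable classes for that instance.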
I have a fine-tuned BERT model for binary classification. Is it possible to apply LIME to the model to explain BERT's predictions?
Thank you!