SHAP Explainer throws an error while using transformers model for abstractive summarization #3442
Comments
I can reproduce your example on master, but I don't think this is really a shap issue. The following lines already result in similar errors:

```python
>>> model([LONG_ARTICLE])
AttributeError: 'list' object has no attribute 'size'
>>> model(LONG_ARTICLE)
AttributeError: 'str' object has no attribute 'size'
```

Can you somehow get this to run so that the model actually returns something?
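These errors are not SHAP-specific: transformers models call torch tensor methods such as `.size()` on their inputs, and raw Python lists and strings simply do not define them. A minimal sketch of the mismatch, with `LONG_ARTICLE` reduced to a placeholder string:

```python
# transformers models invoke tensor methods like .size() on their inputs;
# plain Python lists and strings lack them, hence the AttributeError.
LONG_ARTICLE = "placeholder for the long input article"

for bad_input in ([LONG_ARTICLE], LONG_ARTICLE):
    # Passing these straight into the model fails before inference even starts.
    assert not hasattr(bad_input, "size")
```

Tokenizing first (e.g. `tokenizer(LONG_ARTICLE, return_tensors="pt")`) yields torch tensors that do provide `.size()`.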
I have tried, but it didn't work.
Okay, this indicates that we do not really have a shap error here, since the model is not running at all.
The Hugging Face model itself works well and generates the output summary, but SHAP throws the error below:

```
AttributeError: 'numpy.ndarray' object has no attribute 'new_zeros'
```

Below is the full error trace. I suppose there could be a version mismatch issue.
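The `new_zeros` failure has the same shape as the `.size` errors earlier in the thread: `new_zeros` is a `torch.Tensor` method that `numpy.ndarray` does not provide, so the model crashes as soon as SHAP hands it a numpy array where a tensor is expected. A quick check that needs only numpy:

```python
import numpy as np

# torch.Tensor defines new_zeros(); numpy.ndarray does not, which is why the
# model fails when SHAP passes it a numpy array instead of a tensor.
token_ids = np.ones((1, 8), dtype=np.int64)
assert not hasattr(token_ids, "new_zeros")
# Converting first, e.g. torch.from_numpy(token_ids), would restore the method.
```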
Can you show me how to generate the output with the model you are using? That's what I tried above, but it failed.
Following is a piece of code to generate the output using LongT5.
For the second model, BigBird-Pegasus, the output can be generated using the code below.
Hey, I encountered a similar issue. I think the problem is that SHAP passes in a numpy array instead of a tensor object when calling the transformers API.
@Yangliule Did you find any solution?
Hey @SVC04, I solved the issue by reverting the transformers and pytorch versions. Also, make sure you are using `AutoModelForSeq2SeqLM`.
@Yangliule could you please show the versions that are working? |
Hey @CloseChoice, I'm using transformers 4.12.0 and torch 1.10. Basically, I reverted back to the versions from around when the text SHAP feature was added.
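For reference, the combination reported working above can be pinned in a requirements file; the thread names only "transformers 4.12.0 and torch 1.10", so the exact patch version of torch is an assumption:

```
# Versions reported working in this thread (torch patch version assumed)
transformers==4.12.0
torch==1.10.0
```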
Issue Description
Hello,
I am trying to generate an explanation of abstractive text summarization output for a long piece of input text. I have been using various transformers models, e.g. BigBird-Pegasus, LongT5, Longformer Encoder-Decoder, etc., but none of them worked; each throws a different error. Following is the piece of code I used.
I get a TypeError: 'int' object is not callable.
Minimal Reproducible Example
Expected Behavior
No response
Bug report checklist
Installed Versions
Latest version, installed from the GitHub repo.