
Error while replicating Keras-LSTM example #2466

Open
ishcha opened this issue Mar 30, 2022 · 2 comments
Labels
awaiting feedback Indicates that further information is required from the issue creator deep explainer Relating to DeepExplainer, tensorflow or pytorch

Comments


ishcha commented Mar 30, 2022

I am trying to replicate the Keras-LSTM DeepExplainer example. When computing the SHAP values, I get this warning:

keras is no longer supported, please use tf.keras instead.
Your TensorFlow version is newer than 2.4.0 and so graph support has been removed in eager mode and some static graphs may not be supported. See PR #1483 for discussion.

And this error:
```
TypeError Traceback (most recent call last)
in
1 import shap
2 explainer = shap.DeepExplainer(model, x_train[:100])
----> 3 shap_values = explainer.shap_values(x_test[:10])

~/miniconda3/envs/mtq/lib/python3.8/site-packages/shap/explainers/_deep/init.py in shap_values(self, X, ranked_outputs, output_rank_order, check_additivity)
122 were chosen as "top".
123 """
--> 124 return self.explainer.shap_values(X, ranked_outputs, output_rank_order, check_additivity=check_additivity)

~/miniconda3/envs/mtq/lib/python3.8/site-packages/shap/explainers/_deep/deep_tf.py in shap_values(self, X, ranked_outputs, output_rank_order, check_additivity)
306 # run attribution computation graph
307 feature_ind = model_output_ranks[j,i]
--> 308 sample_phis = self.run(self.phi_symbolic(feature_ind), self.model_inputs, joint_input)
309
310 # assign the attributions to the right part of the output arrays

~/miniconda3/envs/mtq/lib/python3.8/site-packages/shap/explainers/_deep/deep_tf.py in run(self, out, model_inputs, X)
363
364 return final_out
--> 365 return self.execute_with_overridden_gradients(anon)
366
367 def custom_grad(self, op, *grads):

~/miniconda3/envs/mtq/lib/python3.8/site-packages/shap/explainers/_deep/deep_tf.py in execute_with_overridden_gradients(self, f)
399 # define the computation graph for the attribution values using a custom gradient-like computation
400 try:
--> 401 out = f()
402 finally:
403 # reinstate the backpropagatable check

~/miniconda3/envs/mtq/lib/python3.8/site-packages/shap/explainers/_deep/deep_tf.py in anon()
356 shape = list(self.model_inputs[i].shape)
357 shape[0] = -1
--> 358 data = X[i].reshape(shape)
359 v = tf.constant(data, dtype=self.model_inputs[i].dtype)
360 inputs.append(v)

TypeError: 'NoneType' object cannot be interpreted as an integer
```

I have checked PR #1483, but couldn't find a relevant fix there. Please suggest which tensorflow, keras, and shap versions are needed to replicate the example successfully.
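For what it's worth, the TypeError at the bottom of the traceback is consistent with one possible mechanism: `deep_tf.py` builds a reshape target from the symbolic model input shape and replaces only the batch dimension with -1, so if the LSTM input declares any other dimension as unknown (e.g. a variable sequence length of `None`), that `None` survives into the `reshape` call and NumPy rejects it. A minimal sketch of that failure mode, where the `[None, None, 1]` input shape is an assumption for illustration, not taken from the actual example:

```python
import numpy as np

# Hypothetical symbolic Keras input shape for an LSTM layer: both the
# batch dimension and the sequence length are unknown (None).
shape = [None, None, 1]

# deep_tf.py's anon() effectively does this: replace only the batch
# dimension with -1 before reshaping the joint input data.
shape[0] = -1

data = np.zeros((10, 7, 1))  # stand-in for X[i]
try:
    data.reshape(shape)  # shape still contains a None, so this fails
except TypeError as err:
    print(err)  # 'NoneType' object cannot be interpreted as an integer
```

If this is indeed the cause, padding inputs to a fixed sequence length (so that no dimension other than the batch dimension is `None` in the model's input shape) might avoid the error, but that is a guess rather than a confirmed fix.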

@CloseChoice CloseChoice added awaiting feedback Indicates that further information is required from the issue creator deep explainer Relating to DeepExplainer, tensorflow or pytorch labels Dec 8, 2023
CloseChoice (Collaborator) commented:

Thanks for the report. Could you please provide a self-contained minimal reproducible example?

CloseChoice (Collaborator) commented:

Potentially related to #3419
