
BUG: LookupError: gradient registry has no entry for: shap_TensorListStack #3605

Open · 2 of 4 tasks · AmishFaldu opened this issue Apr 9, 2024 · 2 comments
Labels: bug (Indicates an unexpected problem or unintended behaviour)

Comments

@AmishFaldu

Issue Description

When computing SHAP values for a TensorFlow deep learning model, DeepExplainer raises a gradient-registry LookupError for the following operations:

  1. TensorListStack
  2. BatchMatMulV2 (traceback similar to TensorListStack)
  3. While (traceback similar to TensorListStack)
  4. TensorListFromTensor (traceback similar to TensorListStack)

Can you please help me with this?

Thank you
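
For context, shap's TF DeepExplainer overrides the backward pass by registering a custom gradient named "shap_<OpType>" for each op type it has a handler for, so any op missing from its handler table fails with exactly this LookupError. One workaround that is sometimes suggested (a sketch only, untested here) is to map the missing ops onto shap's passthrough handler, which simply reuses TensorFlow's native gradient for those ops:

from shap.explainers._deep import deep_tf

# Untested workaround sketch: fall back to TensorFlow's own gradients for
# the ops shap has no handler for. This may silence the LookupError, but
# attributions flowing through loop ops such as While are not guaranteed
# to be correct.
for op_name in ["TensorListStack", "TensorListFromTensor", "While", "BatchMatMulV2"]:
    deep_tf.op_handlers[op_name] = deep_tf.passthrough

The mapping must be in place before shap_values is called.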

Minimal Reproducible Example

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense, GRU, Input, Lambda, Permute, Reshape, dot
import shap

# Placeholder shapes and data (hypothetical values; any arrays of this
# shape reproduce the error)
series_len, n_features = 30, 5
X_train = np.random.rand(100, series_len, n_features).astype("float32")
X_test = np.random.rand(20, series_len, n_features).astype("float32")

inputs = Input(shape=(series_len, n_features,))
# the GRU's internal while-loop is where the TensorList* and While ops
# in the error come from
lstm_out = GRU(10, return_sequences=True, dropout=0.5, recurrent_dropout=0.3)(inputs)

# Attention-style scoring over the GRU hidden states
hidden_states_t = Permute((2, 1))(lstm_out)
hidden_size = int(lstm_out.shape[2])
hidden_states_t = Reshape((hidden_size, series_len))(hidden_states_t)
score_first_part = Dense(series_len, use_bias=False)(hidden_states_t)
score_first_part_t = Permute((2, 1))(score_first_part)
lambda_out = Lambda(lambda x: x[:, :, -1], output_shape=(hidden_size, 1))(hidden_states_t)
score = dot([score_first_part_t, lambda_out], [2, 1])
output = Dense(1, activation='sigmoid', activity_regularizer=tf.keras.regularizers.l2(0.0001))(score)

model = tf.keras.Model(inputs=inputs, outputs=output)
model.build((32, series_len, n_features))
model.summary()

explainer = shap.DeepExplainer(
    model,
    X_train
)
shap_values = explainer.shap_values(X_test[:10])

Traceback

---------------------------------------------------------------------------
StagingError                              Traceback (most recent call last)
Cell In[11], line 13
      8 explainer = shap.DeepExplainer(
      9     model,
     10     X_train
     11 )
     12 # shap_values = explainer.shap_values(X_test)
---> 13 shap_values = explainer.shap_values(X_test[:10])

File ~/.venv/lib/python3.11/site-packages/shap/explainers/_deep/__init__.py:135, in DeepExplainer.shap_values(self, X, ranked_outputs, output_rank_order, check_additivity)
     91 def shap_values(self, X, ranked_outputs=None, output_rank_order='max', check_additivity=True):
     92     """Return approximate SHAP values for the model applied to the data given by X.
     93 
     94     Parameters
   (...)
    133 
    134     """
--> 135     return self.explainer.shap_values(X, ranked_outputs, output_rank_order, check_additivity=check_additivity)

File ~/.venv/lib/python3.11/site-packages/shap/explainers/_deep/deep_tf.py:310, in TFDeep.shap_values(self, X, ranked_outputs, output_rank_order, check_additivity)
    308 # run attribution computation graph
    309 feature_ind = model_output_ranks[j,i]
--> 310 sample_phis = self.run(self.phi_symbolic(feature_ind), self.model_inputs, joint_input)
    312 # assign the attributions to the right part of the output arrays
    313 for t in range(len(X)):

File ~/.venv/lib/python3.11/site-packages/shap/explainers/_deep/deep_tf.py:366, in TFDeep.run(self, out, model_inputs, X)
    363         tf_execute.record_gradient = tf_backprop.record_gradient
    365     return final_out
--> 366 return self.execute_with_overridden_gradients(anon)

File ~/.venv/lib/python3.11/site-packages/shap/explainers/_deep/deep_tf.py:401, in TFDeep.execute_with_overridden_gradients(self, f)
    399 # define the computation graph for the attribution values using a custom gradient-like computation
    400 try:
--> 401     out = f()
    402 finally:
    403     # reinstate the backpropagatable check
    404     if hasattr(tf_gradients_impl, "_IsBackpropagatable"):

File ~/.venv/lib/python3.11/site-packages/shap/explainers/_deep/deep_tf.py:359, in TFDeep.run.<locals>.anon()
    357     v = tf.constant(data, dtype=self.model_inputs[i].dtype)
    358     inputs.append(v)
--> 359 final_out = out(inputs)
    360 try:
    361     tf_execute.record_gradient = tf_backprop._record_gradient

File ~/.venv/lib/python3.11/site-packages/tensorflow/python/util/traceback_utils.py:153, in filter_traceback.<locals>.error_handler(*args, **kwargs)
    151 except Exception as e:
    152   filtered_tb = _process_traceback_frames(e.__traceback__)
--> 153   raise e.with_traceback(filtered_tb) from None
    154 finally:
    155   del filtered_tb

File ~/.venv/lib/python3.11/site-packages/tensorflow/python/eager/polymorphic_function/autograph_util.py:52, in py_func_from_autograph.<locals>.autograph_handler(*args, **kwargs)
     50 except Exception as e:  # pylint:disable=broad-except
     51   if hasattr(e, "ag_error_metadata"):
---> 52     raise e.ag_error_metadata.to_exception(e)
     53   else:
     54     raise

StagingError: in user code:

    File "~/.venv/lib/python3.11/site-packages/shap/explainers/_deep/deep_tf.py", line 249, in grad_graph  *
        x_grad = tape.gradient(out, shap_rAnD)

    LookupError: gradient registry has no entry for: shap_TensorListStack

Expected Behavior

It should return the SHAP values for the explained samples instead of raising a LookupError.
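
If DeepExplainer cannot be patched cleanly, shap.GradientExplainer may be a workable fallback, since it estimates expected gradients through TensorFlow's standard autodiff rather than shap's per-op handler registry. A minimal sketch, reusing the variables from the example above:

explainer = shap.GradientExplainer(model, X_train)
shap_values = explainer.shap_values(X_test[:10])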

Bug report checklist

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest release of shap.
  • I have confirmed this bug exists on the master branch of shap.
  • I'd be interested in making a PR to fix this bug

Installed Versions

shap 0.45.0

@AmishFaldu added the bug label on Apr 9, 2024
@KrisSreeramakavacham
+1

@songyongyu
+1
