Allow shap to take list or dict as input #3572
Thanks @JonathanBhimani-Burrows for opening the issue. We already support handing over lists when the model takes multiple inputs (but only for the deep, kernel, and gradient explainers). I refer you to one of our tests where we explicitly exercise this feature: https://github.com/shap/shap/blob/master/tests/explainers/test_deep.py#L748-L749. Edit: Currently there are no plans to support dictionaries here, so you have to refer to the different explanations by list index. The reason for this is that we believe it is best if the shap explainer can be called with exactly the same inputs the model can be called with.
Thanks for the reply.
Would it be possible for you to give a couple of examples so that we can find common patterns?
Problem Description
I'm currently working on a PyTorch model that takes a dict of tensors as input. The thing is, each tensor has a wildly different shape, so using a single NumPy array as the input type won't work.
Allowing shap to take a list or a dict would be very useful, as inputs of different lengths would be easy to manage.
Alternative Solutions
Given this might take some time to implement, is there a usable workaround that doesn't require trying to create a dataframe?
In theory, I could create a dataframe where each feature from the input dict gets n columns based on its shape. In the wrapper function, I could take the dataframe, combine the appropriate columns back into a dict, and then infer from there, but this will get very laborious, especially as we need to pass in multiple samples at once.
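The flatten-and-rebuild workaround described above can be sketched generically with NumPy alone, so the per-feature bookkeeping happens once instead of by hand. The feature names and shapes in `SPEC` below are hypothetical placeholders, and `wrapped_model` stands in for a real model call:

```python
import numpy as np

# Hypothetical spec: feature name -> per-sample shape (illustrative only).
SPEC = {"tokens": (5,), "image": (2, 3), "scalar": (1,)}


def dict_to_matrix(batch):
    """Flatten a dict of arrays, each shaped (n, *shape), into one (n, total)
    2D array -- the flat layout a shap explainer can consume."""
    return np.concatenate(
        [np.asarray(batch[name]).reshape(len(batch[name]), -1) for name in SPEC],
        axis=1,
    )


def matrix_to_dict(X):
    """Invert dict_to_matrix: slice the columns back into the original dict."""
    out, start = {}, 0
    for name, shape in SPEC.items():
        size = int(np.prod(shape))
        out[name] = X[:, start:start + size].reshape(len(X), *shape)
        start += size
    return out


def wrapped_model(X):
    """Wrapper with the flat-matrix signature shap expects. The per-sample
    sum below is a placeholder for a real model(**inputs) call."""
    inputs = matrix_to_dict(X)
    return sum(v.reshape(len(X), -1).sum(axis=1) for v in inputs.values())
```

With this pair of helpers, the explainer sees only the flat matrix, and attributions for a given feature can be recovered by slicing the same column ranges that `matrix_to_dict` uses.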
Additional Context
No response