For debugging black-box models, it would be nice to get Shapley feature importance values that relate to the loss of the model rather than to its predictions. I've seen this implemented by the original authors of SHAP, using TreeExplainer, under the assumption that features are independent. This Medium article goes into more depth about the implementation.
I'm wondering: would it be possible to obtain model-agnostic SHAP loss values for dependent features, similar to how shapr does so for predictions?
I agree that this would be of interest. I think SAGE is the method you are looking for here. Take a look at this nice presentation: https://iancovert.com/blog/understanding-shap-sage/
As far as I know, the SAGE implementation ignores feature dependence, so it would be nice to implement it using a proper conditioning scheme like the ones we have in shapr. I certainly think it is doable, but it is not currently on our TODO list.
Interesting! From what I understand, SAGE seems relatively easy to implement with the current code, mainly by altering the compute_vS functions to compute a loss instead of a prediction. I may be free to work on this in a few weeks. Any comments on how you would want it organized in this repo?
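To make the idea concrete, here is a minimal sketch of what a loss-based value function could look like. This is not shapr's actual compute_vS code (which is in R); the function names `loss_value_function` and `sample_conditional` are hypothetical, and the example assumes a conditional sampler is available, like shapr's conditioning schemes. The key change from a prediction-based value function is that the averaged predictions are scored against the true labels with a loss:

```python
import numpy as np

def loss_value_function(model_predict, loss_fn, X, y, S,
                        sample_conditional, n_samples=50):
    """Hypothetical sketch: value of coalition S measured via model loss.

    Features outside S are repeatedly imputed by draws from a conditional
    sampler (given the features in S), predictions are averaged over the
    draws, and the negative mean loss against the true labels is returned,
    so that lower loss means higher coalition value (SAGE-style).
    """
    preds = np.zeros(X.shape[0])
    for _ in range(n_samples):
        # sample_conditional is assumed to fill in features not in S,
        # conditioned on the observed values of the features in S
        X_imputed = sample_conditional(X, S)
        preds += model_predict(X_imputed)
    preds /= n_samples
    return -np.mean(loss_fn(y, preds))
```

For illustration, a sampler that replaces off-coalition features with their marginal means corresponds to the independence assumption; swapping in a dependence-aware sampler is exactly where shapr's conditioning schemes would come in.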