Difference feature importance sklearn/shapash #363

Answered by ThomasBouche
Redflyo asked this question in Q&A

Hi,
Sorry for the delay in responding.
In Shapash, feature importance is computed as the sum of the absolute local contributions (computed by shap by default) across the dataset. A feature's importance is therefore its share of the total global contribution.

So yes, the difference between Shapash and Sklearn feature importance is expected: scikit-learn's built-in importances (e.g. impurity-based importances for tree models) are computed in a different way, so the two need not match.
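The idea above can be sketched in a few lines. This is a minimal illustration, not Shapash's actual implementation: the contribution matrix and the function name are hypothetical, standing in for the per-sample SHAP values that Shapash compiles.

```python
# Hypothetical local contributions (e.g. SHAP values):
# rows = samples, columns = features.
contributions = [
    [0.2, -0.5, 0.1],
    [-0.3, 0.4, 0.0],
    [0.1, -0.6, 0.2],
]

def shapash_style_importance(contribs):
    """Sum |local contribution| per feature, then normalize so importances sum to 1."""
    n_features = len(contribs[0])
    # Global contribution of each feature: sum of absolute local contributions.
    totals = [sum(abs(row[j]) for row in contribs) for j in range(n_features)]
    grand_total = sum(totals)
    # Importance = each feature's share of the total global contribution.
    return [t / grand_total for t in totals]

print(shapash_style_importance(contributions))
```

Because the values are normalized shares of the summed contributions, they are on a different scale (and often a different ranking) than sklearn's impurity-based `feature_importances_`.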

Replies: 1 comment

Answer selected by Redflyo
Category
Q&A
2 participants