How to get the exact SHAP importance values? #33
Comments
Oh, maybe I should ask this in the shap repository instead.
When I use model.estimator_ to calculate SHAP values, I found that some features correspond to NaN values, maybe because I used the min_features_to_select option? How can I get the SHAP values that were used to select the features?
Hi, see shaphypetune/utils.py, lines 36 to 44 (at commit 47316d3).
If you support the project, don't forget to leave a star ;-)
For example, I have a BoostRFE object and have called its model.fit() method. How can I then get the exact list of SHAP importances corresponding to each feature selected by the model's estimator (just like what shap.plots.bar shows)? The values output by shap.Explainer are not that straightforward to me, and I have not found any information about the shape or meaning of those values.
Or can I get the SHAP values directly through the BoostRFE object?
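To connect the per-sample output of shap.Explainer to the bar-plot importances asked about above: shap.plots.bar shows, for each feature, the mean of the absolute SHAP values across all samples. The sketch below illustrates that aggregation with a synthetic numpy matrix; in practice the matrix would come from something like shap.TreeExplainer(model.estimator_).shap_values(X) after fitting a BoostRFE object (the attribute name estimator_ and the feature names here are assumptions for illustration).

```python
import numpy as np

# Hypothetical SHAP value matrix of shape (n_samples, n_features).
# In real use this would be produced by a SHAP explainer run on the
# fitted inner estimator, e.g. shap.TreeExplainer(model.estimator_).
rng = np.random.default_rng(0)
shap_values = rng.normal(size=(100, 4))

# Global importance per feature, as drawn by shap.plots.bar:
# the mean absolute SHAP value over all samples.
global_importance = np.abs(shap_values).mean(axis=0)

# Illustrative feature names; rank features by importance.
feature_names = ["f0", "f1", "f2", "f3"]
ranking = sorted(zip(feature_names, global_importance),
                 key=lambda pair: pair[1], reverse=True)
for name, importance in ranking:
    print(f"{name}: {importance:.4f}")
```

Features eliminated during RFE would simply be absent from (or NaN in) the importance array, so any NaN entries can be dropped before ranking.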