
Any way you could update this to work in Python using XGBoost? #1

Closed
bbennett36 opened this issue Sep 18, 2017 · 3 comments

Comments

@bbennett36

I was trying to use the examples you provided, but I can't seem to get them working with the current version of XGBoost.

@slundberg
Collaborator

slundberg commented Sep 20, 2017 via email

@bbennett36
Author

I saw your update on XGBoost, so you can close this if you want.

I did have another question; it's not really an issue, so I could email you if you'd prefer to discuss it further.

Say I had 500 features in a model and 400 of them pushed the model score down or had little to no importance. Is it safe to drop those features and re-train the model? I'm just wondering whether doing this has any negative effects.

@slundberg
Collaborator

The XGBoost PR was just merged, so I'll close this issue now. Check out the example on the homepage for instructions on how to use it! (It assumes you have a version of XGBoost master newer than 10/12/17.)
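
For anyone landing here later, here is a minimal sketch of that usage, assuming a sufficiently recent XGBoost master and the shap package installed; the synthetic data and training parameters below are placeholders, not the homepage example itself.

```python
import numpy as np
import xgboost
import shap

# Placeholder data: 500 samples, 10 features, a noisy linear target.
rng = np.random.RandomState(0)
X = rng.normal(size=(500, 10))
y = X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Train an XGBoost model (parameters are arbitrary, just for illustration).
dtrain = xgboost.DMatrix(X, label=y)
model = xgboost.train({"eta": 0.1, "max_depth": 3}, dtrain, num_boost_round=100)

# Explain the model's predictions with Tree SHAP.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # one row of SHAP values per sample

# Each row of SHAP values plus the expected value sums to the model's output
# for that sample.
print(shap_values.shape)
print(explainer.expected_value)
```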

As for the feature question: if they push the model output down, then they are important to the model. If you mean they push the model's accuracy down, then sure, leave them out. The only way to know for sure is to actually do it and see whether performance on a test dataset improves. XGBoost is typically good at selecting only the features that matter (it is a stage-wise expansion, which has connections to L1 regularization), so I would be surprised if dropping them helped significantly unless the dataset is very small.
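
A minimal sketch of that check, assuming mean |SHAP value| is used as the importance measure and a simple train/test split; the synthetic data, the cutoff for which features to keep, and the MSE metric are arbitrary choices for illustration.

```python
import numpy as np
import xgboost
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholder data: 1000 samples, 50 features, only the first two are informative.
rng = np.random.RandomState(0)
X = rng.normal(size=(1000, 50))
y = X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def fit_and_score(X_tr, y_tr, X_te, y_te):
    # Train an XGBoost model and return it with its test-set MSE.
    model = xgboost.train({"eta": 0.1, "max_depth": 3},
                          xgboost.DMatrix(X_tr, label=y_tr), num_boost_round=100)
    preds = model.predict(xgboost.DMatrix(X_te))
    return model, mean_squared_error(y_te, preds)

# Baseline model trained on all features.
model_all, mse_all = fit_and_score(X_train, y_train, X_test, y_test)

# Rank features by mean |SHAP value| on the training set.
shap_values = shap.TreeExplainer(model_all).shap_values(X_train)
importance = np.abs(shap_values).mean(axis=0)
keep = importance > np.percentile(importance, 80)  # arbitrary cutoff: keep top 20%

# Re-train on the reduced feature set and compare test performance.
_, mse_reduced = fit_and_score(X_train[:, keep], y_train, X_test[:, keep], y_test)
print(f"all features MSE: {mse_all:.4f}, reduced MSE: {mse_reduced:.4f}")
```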
