Anyway you could update using this in python using XGBoost? #1
Hey! Sorry about that, I pushed some updates to shap before the PR on XGBoost got merged. I'll get the XGBoost PR finished up and then they will work.
- Scott
On Mon, Sep 18, 2017 at 1:06 PM Brennan Bennett wrote:
Was trying to use the examples you provided but can't seem to get it
working with the current version of XGBoost.
I saw your update on XGBoost, so you can close this if you want. I did have another question, not really an issue, so I could email you if you'd prefer to discuss it further. Say I had 500 features in a model and 400 of them pushed the model score down or had little to no importance: is it safe to drop those features and retrain the model? Just wondering if there are any negative effects of doing this.
The XGBoost PR was just merged, so I'll close this issue now. Check out the example on the homepage for instructions on how to use it (it assumes you have a version of master newer than 10/12/17).

As for the feature question: if they push the model *output* down, then they are important to the model. If you mean they push the model *accuracy* down, then sure, leave them out. The only way to know for sure is to actually do it and see whether performance on a held-out test dataset improves. Typically XGBoost is good at selecting only the features that matter (it is a stage-wise additive expansion, which has connections to L1 regularization), so I would be surprised if dropping them helped significantly unless the dataset is very small.
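A minimal sketch of the drop-and-retrain check described above: train on all features, keep only those with above-average importance, retrain, and compare test-set scores. This uses scikit-learn's gradient boosting (also a stage-wise expansion) as a stand-in; if you have xgboost installed, `xgboost.XGBClassifier` exposes the same `fit`/`score`/`feature_importances_` interface. The mean-importance threshold and synthetic dataset are illustrative choices, not anything from this thread.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic data: 50 features, only 5 of which carry signal.
X, y = make_classification(n_samples=2000, n_features=50,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: train on all features.
full = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc_full = full.score(X_te, y_te)

# Keep only features with above-average importance (arbitrary threshold).
keep = full.feature_importances_ > full.feature_importances_.mean()
reduced = GradientBoostingClassifier(random_state=0).fit(X_tr[:, keep], y_tr)
acc_reduced = reduced.score(X_te[:, keep], y_te)

print(f"all 50 features: {acc_full:.3f}")
print(f"{keep.sum()} kept features: {acc_reduced:.3f}")
```

Whether the reduced model wins here is beside the point; the test-set comparison is the decision rule.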