
Shap loss for regression problems #1784

Open
mschiwek opened this issue Jan 26, 2021 · 4 comments

Comments

@mschiwek

Hi,

The paper "Explainable AI for Trees: From Local Explanations to Global Understanding" shows monitoring plots that explain the SHAP loss values for squared errors. If I understand it correctly, the hospital duration model solves a regression problem. However, I could not find out how these SHAP loss values are calculated. If I'm not mistaken, the shap library currently only lets you calculate these values for classification problems.

Have I misunderstood something, or is there an easy way to calculate these values for regression problems?

@mschiwek
Author

I did some digging in the code. It seems like we are able to explain squared_loss after all. What confuses me is the parameter model_output, which has to be set to log_loss:

explainer = TreeExplainer(model, x, feature_dependence="independent", model_output="log_loss")

By setting it to log_loss I was able to explain my squared error. What I do not understand is why model_output has to be set to log_loss. Isn't log loss usually used for classification problems?
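
For reference, here is a minimal sketch of what I am describing, with toy data and placeholder names (not my actual model). It assumes an xgboost regressor with the default squared-error objective and uses the newer parameter name feature_perturbation="interventional", which corresponds to feature_dependence="independent":

```python
import numpy as np
import xgboost
from shap import TreeExplainer

# Toy regression problem; all data and variable names here are placeholders.
rng = np.random.RandomState(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.1, size=500)

model = xgboost.XGBRegressor(n_estimators=100, max_depth=3).fit(X, y)

# model_output="log_loss" asks TreeExplainer to explain the per-sample loss
# instead of the raw prediction; a background dataset is required for the
# interventional (independent) feature perturbation.
explainer = TreeExplainer(
    model, X, feature_perturbation="interventional", model_output="log_loss"
)

# The labels are passed along with the features when explaining the loss.
loss_shap = explainer.shap_values(X, y)

# If the squared error is what gets explained, each row of SHAP values plus the
# expected value (the expected loss over the background data, as I understand it)
# should reconstruct that sample's squared error.
squared_error = (model.predict(X) - y) ** 2
reconstruction = loss_shap.sum(axis=1) + explainer.expected_value
print(np.abs(reconstruction - squared_error).max())
```

If that last check holds for you as well, it would confirm that the squared error is what is being decomposed, despite the parameter being called log_loss.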

@alberto-bracci

Hi, I am having a similar problem. I am using an xgboost regressor with the reg:logistic objective. With the same options as you, the SHAP values plus the expected value do not sum up to the squared error (off by far), nor to the log loss computed as if it were a classification problem (off by a little).

What exactly do you mean by "explain my squared error"?
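
For context, this is roughly the check I am doing. It is only a sketch: compare_losses is a hypothetical helper, and shap_sum, preds and y stand for the per-sample SHAP sums plus the expected value, the regressor's predictions in [0, 1], and the labels.

```python
import numpy as np

def compare_losses(shap_sum, preds, y, eps=1e-7):
    """Compare the reconstructed loss against the two candidate losses.

    Hypothetical helper for the check described above; the inputs are
    placeholders, not part of the shap API.
    """
    # Squared error of the regressor's predictions.
    squared_error = (preds - y) ** 2
    # Log loss computed as if the reg:logistic output were a class probability.
    p = np.clip(preds, eps, 1.0 - eps)
    log_loss = -(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
    return (
        np.abs(shap_sum - squared_error).max(),
        np.abs(shap_sum - log_loss).max(),
    )
```

The first difference is large and the second is small but not zero, which is why I am unsure which loss, if either, is actually being explained here.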

@sebastianptz

Hi, I am running into the same problem with the lightgbm regressor. Have you figured this out by now?

@CoteDave

Hi, same here. Is it possible to calculate the SHAP loss for regression problems?
