
set check_additivity=False for CausalForestDML #458

Merged
merged 4 commits on May 6, 2021

Conversation

heimengqi
Contributor

No description provided.

Collaborator

@kbattocchi left a comment

LGTM

Collaborator

@vsyrgkanis left a comment

Why are we only adding this flag in that call? If this is a problem about the correlation of variables and our use of background masks, and not about trees in particular, wouldn't it appear elsewhere too?

Also, we could in principle make this flag a public-facing variable with the default being False.
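
For concreteness, a purely hypothetical sketch of such a public-facing flag (the wrapper name and default are illustrative only, not the current econml API):

```python
import shap

# Hypothetical wrapper only: expose check_additivity publicly with a False
# default, while still letting callers opt back into the strict check.
def cate_shap_values(explainer: shap.TreeExplainer, X, *, check_additivity=False):
    # Forward the flag to shap's TreeExplainer.shap_values.
    return explainer.shap_values(X, check_additivity=check_additivity)
```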

@heimengqi
Contributor Author

Why are we only adding this flag in that call? If this is a problem about the correlation of variables and our use of background masks, and not about trees in particular, wouldn't it appear elsewhere too?

Also, we could in principle make this flag a public-facing variable with the default being False.

This is going to be updated. After digging into the different explainers and the shap doc examples: for TreeExplainer we shouldn't pass a masker, since by default it uses the training-sample information stored in the estimator object, whereas the other explainers require a masker. However, the issue is that when we call the generic Explainer, we don't know which exact explainer it will pick. So the current solution is: I first use Explainer to identify the explainer type, and if it is Tree, I redefine an explicit TreeExplainer without a masker instead. This fixes the additivity exception and is consistent with the default usage of the different explainers in the shap docs.
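
A rough sketch of that dispatch logic (placeholder names such as model, background_X, and X; assuming shap's generic Explainer and its Tree subclass, not the exact econml code):

```python
import shap

# Sketch only: let the generic Explainer pick an algorithm, then rebuild an
# explicit TreeExplainer without a masker if a tree explainer was chosen, so
# shap falls back to the statistics stored in the fitted trees.
explainer = shap.Explainer(model, masker=background_X)  # model/background_X are placeholders
if isinstance(explainer, shap.explainers.Tree):
    # No masker in the tree case, matching the shap docs' default usage.
    explainer = shap.TreeExplainer(model)
shap_values = explainer(X)
```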

However, there are some other issues around this (e.g. TreeExplainer with and without a masker outputs base_values of different shapes, and for TreeExplainer without a masker the base_value no longer equals the mean of the CATE); I am looking into them now.

@heimengqi
Contributor Author

I take that back: we should still pass a masker to TreeExplainer and go with "interventional" feature perturbation, and just disable the additivity check. Since we have access to the training (background) dataset, "interventional" should be preferred and it gives a causal explanation, and LinearExplainer uses the "interventional" approach as well.
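
A minimal sketch of that final approach (placeholder names; not the exact change in this PR):

```python
import shap

# Sketch only: keep the background (training) data so shap uses the
# "interventional" perturbation, and skip the additivity check that raises
# for the causal forest's final-stage trees.
explainer = shap.TreeExplainer(
    model,                                  # placeholder fitted tree model
    data=background_X,                      # background / training sample (the masker data)
    feature_perturbation="interventional",
)
shap_values = explainer.shap_values(X, check_additivity=False)
```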
