
Support for more parameters in the Differentially Private models #524

Open
sadsquirrel369 opened this issue Mar 31, 2024 · 1 comment

@sadsquirrel369

Do the Differentially Private models support monotone constraints and alternative objective functions, given that they inherit from the same base EBM model?

@paulbkoch
Collaborator

Hi @sadsquirrel369 --

Regarding alternative objectives, I'll let @Harsha-Nori (the primary author of the DP-EBM paper) comment on that in more depth, but my limited understanding is that the DP proof would need to be updated for each alternative objective.

For monotonicity, we currently support two types of monotone constraints. Please note that we recommend post-processed monotonization when monotonicity is being used in the context of responsible AI: if monotone constraints are applied during fitting, the model will often be able to shift monotone violations onto other correlated features (see #184 for more details). We only recommend monotone constraints during fitting when you are 100% sure the underlying generation function is fundamentally monotone, such as a feature that comes from a physical system, or for investigative purposes where you're curious to find out where the model would shift effect when monotone constraints are applied (you can do a model diff in this case).
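To illustrate what post-processed monotonization does conceptually, here is a minimal pool-adjacent-violators sketch in plain Python that projects a term's per-bin scores onto the nearest non-decreasing sequence. This is only an illustration of the idea, not interpret's implementation; the supported API is the library's `monotonize` method.

```python
def monotonize_scores(scores, weights=None):
    """Project per-bin scores onto the nearest non-decreasing sequence
    using pool-adjacent-violators. Illustrative sketch only; not
    interpret's internal implementation."""
    if weights is None:
        weights = [1.0] * len(scores)
    blocks = []  # each block: [weighted_mean, total_weight, n_bins]
    for s, w in zip(scores, weights):
        blocks.append([s, w, 1])
        # Merge adjacent blocks whenever they violate the ordering.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2, n1 + n2])
    out = []
    for mean, _, n in blocks:
        out.extend([mean] * n)
    return out

# A single downward violation gets pooled with its neighbor:
print(monotonize_scores([1.0, 3.0, 2.0]))  # [1.0, 2.5, 2.5]
```

Because this runs on the final (public) score tensors rather than on the training data, it composes cleanly with differential privacy.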

The post-processed model-editing monotonization should be fine to use with DP models, since it operates on the public model after fitting (https://interpret.ml/docs/python/api/ExplainableBoostingRegressor.html#interpret.glassbox.ExplainableBoostingRegressor.monotonize). Applying monotone constraints during fitting of DP models is currently not supported, but I think it would be possible to add. @Harsha-Nori can correct me if I'm wrong, but I believe the noisy update is fully public information on this line:

noisy_update_tensor = -noisy_update_tensor

And the monotone constraints would be honored if the update was disallowed or adjusted when the update would otherwise contain a monotone violation.
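A hypothetical sketch of that idea, assuming access to the cumulative term scores and the noisy update as plain arrays (the function name and the accept/reject policy here are illustrative, not interpret's internal code):

```python
def constrain_update(cum_scores, update, increasing=True):
    """Hypothetical sketch: reject a boosting update that would introduce
    a monotone violation in the cumulative term scores. Since the noisy
    update is public, this check would not consume extra privacy budget."""
    proposed = [c + u for c, u in zip(cum_scores, update)]
    pairs = zip(proposed, proposed[1:])
    ok = all(a <= b for a, b in pairs) if increasing \
        else all(a >= b for a, b in pairs)
    if ok:
        return update          # accept the full update
    return [0.0] * len(update)  # disallow the violating update

# An update that preserves monotonicity is accepted:
print(constrain_update([0.0, 1.0, 2.0], [0.1, 0.1, 0.1]))
# One that would break it is disallowed:
print(constrain_update([0.0, 1.0, 2.0], [0.0, 2.0, 0.0]))
```

A real implementation might instead adjust the violating bins (e.g. clip them against their neighbors) rather than dropping the whole update, but either policy operates only on public quantities.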
