Warning when optimized value for GP's kernel hits bound #12638

Closed

SylvainLan opened this issue Nov 21, 2018 · 4 comments

@SylvainLan
Contributor

Hi,
in sklearn.gaussian_process, the hyperparameters of the given kernel (e.g. alpha and length_scale) are optimized within a specified range of values.

It can happen that the best parameter value lies outside this range.

The fact that the returned parameter matches the upper or lower bound of the range can be an indicator of such an issue, so I was wondering whether it would be worth raising a warning in that case, recommending that the user broaden the range and redo the fit.

I could try to implement it if you think it's worth it.
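For illustration, here is a minimal sketch (not the proposed implementation) of the situation described above: fit a GaussianProcessRegressor with deliberately tight length_scale bounds, then check whether the optimized log-hyperparameters in `kernel_.theta` sit on the log-bounds in `kernel_.bounds`. The toy data and the bound choices are assumptions made only for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy data whose natural length scale is well above the bounds chosen below.
rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(30, 1))
y = np.sin(X).ravel()

# Deliberately tight bounds so the optimizer is likely to end up on one of them.
kernel = RBF(length_scale=0.05, length_scale_bounds=(1e-2, 1e-1))
gpr = GaussianProcessRegressor(kernel=kernel, alpha=1e-6).fit(X, y)

# kernel_.theta holds the optimized log-hyperparameters,
# kernel_.bounds the corresponding log-bounds (shape (n_params, 2)).
theta = gpr.kernel_.theta
lower, upper = gpr.kernel_.bounds[:, 0], gpr.kernel_.bounds[:, 1]
at_bound = np.isclose(theta, lower) | np.isclose(theta, upper)
if at_bound.any():
    print("Hyperparameter(s) at a bound (log scale):", theta[at_bound])
```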

@jnothman
Member

jnothman commented Nov 21, 2018 via email

@SylvainLan
Contributor Author

You mean in case the global optimum really is the upper/lower bound? I'd say the risk is low, and in any case redoing the fit with broader bounds would show whether that was indeed the best parameter or not.

The problem I see is in degenerate cases where, for some reason, the optimal value is 0 or +inf; I don't know whether such a case could occur with ill-conditioned matrices.
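As a rough continuation of the earlier sketch (same toy sinusoidal data and RBF kernel; `fit_with_bounds` is a hypothetical helper introduced only for this example), refitting with broader bounds and comparing log-marginal likelihoods is one way to tell whether the bound value was genuinely optimal:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(30, 1))
y = np.sin(X).ravel()


def fit_with_bounds(bounds):
    # Hypothetical helper: fit the same toy data under the given length_scale bounds,
    # starting from the geometric mean of the bounds.
    kernel = RBF(length_scale=np.sqrt(bounds[0] * bounds[1]),
                 length_scale_bounds=bounds)
    return GaussianProcessRegressor(kernel=kernel, alpha=1e-6).fit(X, y)


tight = fit_with_bounds((1e-2, 1e-1))  # likely to end up on the upper bound
wide = fit_with_bounds((1e-5, 1e5))    # free to move to the true optimum

# If the wider search reaches a clearly better log-marginal likelihood at a
# different length scale, the bound value was an artefact of the range rather
# than a genuine optimum.
print("tight bounds:", tight.kernel_.length_scale, tight.log_marginal_likelihood_value_)
print("wide bounds: ", wide.kernel_.length_scale, wide.log_marginal_likelihood_value_)
```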

@jnothman
Member

Submit a pull request and we will see what others think.

@cmarmo
Member

cmarmo commented Jul 20, 2020

Fixed in #12673.

@cmarmo cmarmo closed this as completed Jul 20, 2020