GPR not optimizing #12444
Comments
I think the docs are a bit ambiguous, but I don't think we have any motivation to change the behaviour. Please feel free to submit a pull request clarifying the documentation.
I think having a self-tuning kernel by default is a nice feature and matches the "spirit" of the docstring. It might be considered a bug that it does not do so by default. How many tests in the existing test suite would fail if we changed that?
Hi,
I'm sorry, should I be responding to this?
Thanks,
Kelsey
On Oct 2, 2020, at 9:07 AM, Olivier Grisel <notifications@github.com> wrote:
@ogrisel I agree, having the kernel adapt was my expected behavior when using this. I removed the "fixed" bounds from the default kernel and ran the test suite, and it seems like no tests are breaking and the doc examples pass (unless I'm not testing another affected module).
However, I did notice that there's this test that would probably have to change if we decide to change the default kernel behavior: `sklearn/gaussian_process/tests/test_gpr.py`, lines 438 to 451 (at commit 2538489).
P.S. Sorry for bumping such an old issue @klinnell, I just opened a PR at #18518 to try to address this. It seems like you're not the only one who expected a self-adapting kernel, though!
Edit: it looks like there are quite a few tests that do fail in
most of them seem related to
Description
When using GPR, it currently does NOT optimize the hyperparameters when kernel is set to None, as the documentation says it should. (GaussianProcessRegressor from gpr.py)
Steps/Code to Reproduce
The code below should produce some GPR interpolation, but if you change the y values you can notice that the kernel doesn't change. This is because the optimization step is not being performed in gpr.py.
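The original reproduction snippet was attached as an image and is not recoverable, but a minimal sketch of the reported behavior might look like the following (the data values here are illustrative, not from the original report). With `kernel=None`, the fitted `kernel_` comes out identical no matter what targets are fitted, because the default kernel's bounds are "fixed":

```python
# Hedged reconstruction of the missing reproduction snippet (original was an image).
# With kernel=None, GaussianProcessRegressor falls back to a default kernel whose
# hyperparameter bounds are "fixed", so fit() never tunes length_scale.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

X = np.linspace(0, 10, 20).reshape(-1, 1)

# Fit two very different target functions on the same inputs.
gpr_a = GaussianProcessRegressor(kernel=None).fit(X, np.sin(X).ravel())
gpr_b = GaussianProcessRegressor(kernel=None).fit(X, 5.0 * np.sin(0.3 * X).ravel())

# The fitted kernels are identical: no hyperparameter optimization happened.
print(gpr_a.kernel_)
print(gpr_b.kernel_)
```

If the kernel were actually being optimized, the two fitted `length_scale` values would differ for such different targets.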
What I did to fix this:
The commented code is what WAS there (lines 173-175 of gpr.py) and the uncommented code is what I replaced it with. Declaring the bounds as "fixed" overrides the ability to optimize.
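The before/after snippet was also attached as an image. A sketch of the change being described, reconstructed from the scikit-learn source of that era (the exact lines may differ slightly), is below. A kernel's `theta` vector contains only its non-fixed hyperparameters, so with "fixed" bounds there is literally nothing for the optimizer to tune:

```python
# Hedged sketch of the described fix; reconstructed, not the exact diff.
from sklearn.gaussian_process.kernels import RBF, ConstantKernel as C

# What WAS there (the default kernel): bounds declared "fixed",
# which removes both hyperparameters from the optimization.
default_fixed = C(1.0, constant_value_bounds="fixed") * RBF(
    1.0, length_scale_bounds="fixed"
)

# The replacement: leave the bounds at their defaults so fit() can tune them.
default_tunable = C(1.0) * RBF(1.0)

print(default_fixed.theta.size)    # nothing to optimize
print(default_tunable.theta.size)  # both hyperparameters are free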