Hi,
in sklearn.gaussian_process, the parameters alpha and length_scale of the given kernel are optimized within a specified range of values.
It can happen that the best parameter lies outside this range.
The fact that the returned parameter matches the upper or lower bound of the range can be an indicator of such an issue, so I was wondering whether it would be of interest to raise a Warning in that case, recommending that the user broaden the range of values and redo the fit.
I could try to do it if you think it's worth it.
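To make the idea concrete, here is a minimal sketch (not the actual implementation) of what such a check could look like after fitting: the fitted kernel exposes its log-transformed hyperparameters via kernel_.theta and their log-transformed bounds via kernel_.bounds, so hitting a bound can be detected by comparison. The narrow length_scale_bounds below is only chosen to make the optimizer run into the bound on toy data.

```python
import warnings

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy data; the deliberately narrow length-scale bounds below make it
# likely that the optimizer ends up stuck at a bound.
rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.randn(30)

kernel = RBF(length_scale=0.05, length_scale_bounds=(1e-2, 1e-1))
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)

# Compare the fitted hyperparameters with their bounds.
# Both kernel_.theta and kernel_.bounds are in log-space.
theta = gpr.kernel_.theta
bounds = gpr.kernel_.bounds  # shape (n_dims, 2): [lower, upper]
if np.any(np.isclose(theta, bounds[:, 0])) or np.any(np.isclose(theta, bounds[:, 1])):
    warnings.warn(
        "At least one hyperparameter ended up at its bound; "
        "consider widening the bounds and refitting."
    )
```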
You mean if the globally optimal value is indeed the upper/lower bound? I'd say the risk is low, but in any case redoing the fit would tell whether this was indeed the best parameter or not.
The problem I see is in degenerate cases where, for some reason, the optimal value is 0 or +inf; I don't know whether such a case could happen with ill-conditioned matrices.
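For reference, the "redo the fit" check mentioned above could be sketched like this, reusing the X, y and gpr objects from the snippet earlier in the thread and simply widening the bounds; if the refit moves the length scale away from the old bound and improves the log-marginal likelihood, the original range was indeed too narrow.

```python
# Refit with much wider bounds and compare log-marginal likelihoods.
wide_kernel = RBF(length_scale=0.05, length_scale_bounds=(1e-5, 1e5))
gpr_wide = GaussianProcessRegressor(kernel=wide_kernel).fit(X, y)

print("narrow bounds:", gpr.kernel_, gpr.log_marginal_likelihood_value_)
print("wide bounds:  ", gpr_wide.kernel_, gpr_wide.log_marginal_likelihood_value_)
```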