Fix integrating hyper-parameters #237
Conversation
Codecov Report

```diff
@@            Coverage Diff            @@
##           master     #237     +/-   ##
=========================================
- Coverage   89.40%   89.09%   -0.31%
=========================================
  Files         114      114
  Lines        3323     3339      +16
  Branches      359      360       +1
=========================================
+ Hits         2971     2975       +4
- Misses        280      291      +11
- Partials       72       73       +1
```

Continue to review full report at Codecov.
```diff
@@ -3,7 +3,7 @@
 import numpy as np

 from emukit.core.acquisition import Acquisition
 from emukit.core.interfaces import IModel, IPriorHyperparameters, IDifferentiable
```
Isn't it actually used? I mean, there is definitely a call to `model.evaluate_with_gradients` below. I'd suggest adding this interface to the model type hint.
I have always been confused about how we should do type hints when some interfaces are optional to implement. The type hint should say "must implement IModel and optionally implement IDifferentiable". We usually write `Union[IModel, IDifferentiable]`, which we intend to mean "must implement both IModel and IDifferentiable". With the current type hints, you get warnings from the type-checking system if you pass in a model that doesn't implement gradients.
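For illustration, here is a minimal sketch of the pattern under discussion, assuming the emukit interfaces behave as in the diff above; `MyAcquisition` is a hypothetical class, not code from this PR:

```python
from typing import Union

import numpy as np

from emukit.core.acquisition import Acquisition
from emukit.core.interfaces import IDifferentiable, IModel


class MyAcquisition(Acquisition):
    """Hypothetical acquisition for which gradients are optional."""

    def __init__(self, model: Union[IModel, IDifferentiable]) -> None:
        # Strictly, Union means "either interface", not "both", so a type
        # checker may warn when a plain IModel (no gradients) is passed,
        # even though that is a supported use of this class.
        self.model = model

    def evaluate(self, x: np.ndarray) -> np.ndarray:
        # Acquisition value; here just the model mean, for illustration.
        mean, _ = self.model.predict(x)
        return mean

    @property
    def has_gradients(self) -> bool:
        # Gradient support is decided at runtime from the model's interfaces.
        return isinstance(self.model, IDifferentiable)
```

Python's `typing` has no intersection type, so there is no precise way to spell "must implement both"; the `Union` convention is a pragmatic stand-in.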
```diff
     @property
     def has_gradients(self) -> bool:
         """Returns that this acquisition has gradients"""
-        return isinstance(self.model, IDifferentiable)
+        return self._has_gradients
```
Let's cover this with a unit test
Done
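For reference, a sketch of what such a unit test might look like; the acquisition class here is a stand-in mirroring the pattern in the diff above, not the PR's actual test:

```python
from unittest import mock

import numpy as np
import pytest

from emukit.core.acquisition import Acquisition
from emukit.core.interfaces import IDifferentiable, IModel


class _StubAcquisition(Acquisition):
    """Stand-in mirroring the pattern under review: gradient support is
    decided once, from the model's interfaces, at construction time."""

    def __init__(self, model) -> None:
        self.model = model
        self._has_gradients = isinstance(model, IDifferentiable)

    def evaluate(self, x: np.ndarray) -> np.ndarray:
        return np.zeros((x.shape[0], 1))

    @property
    def has_gradients(self) -> bool:
        return self._has_gradients


class _DifferentiableModel(IModel, IDifferentiable):
    """Combined type so mock.Mock(spec=...) passes both isinstance checks."""


@pytest.mark.parametrize(
    "model_spec, expected", [(IModel, False), (_DifferentiableModel, True)]
)
def test_has_gradients_matches_model_capabilities(model_spec, expected):
    acquisition = _StubAcquisition(mock.Mock(spec=model_spec))
    assert acquisition.has_gradients is expected
```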
Fix model hyperparameters

```python
        """
        if self.gpy_model._fixes_ is None:
```
Is there a way in GPy to do this without calling private stuff like `_fixes_` and `_trigger_params_changed`?
I'll investigate; this is the way the other model wrapper does it, though.
Following our offline conversation, can you please create an issue to remind us to look deeper into this GPy magic? We may even consider contributing interface changes to GPy if necessary.
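For the record, a hedged sketch of the pattern in question, assuming GPy 1.x behaviour: `_fixes_` is `None` when no parameters are fixed and otherwise a boolean mask over the parameter array, and `_trigger_params_changed()` forces GPy to recompute its cached state. This illustrates the approach from the diff above, not necessarily the merged code:

```python
import numpy as np
import GPy


class _GPyWrapperSketch:
    """Illustrative fragment of a GPy model wrapper (not the real emukit class)."""

    def __init__(self, gpy_model: GPy.core.GP) -> None:
        self.gpy_model = gpy_model

    def fix_model_hyperparameters(self, sample: np.ndarray) -> None:
        """Set the model hyper-parameters to a given sample."""
        if self.gpy_model._fixes_ is None:
            # Nothing is fixed: assign the full parameter array.
            self.gpy_model[:] = sample
        else:
            # Some parameters are fixed: assign only the free ones.
            self.gpy_model[self.gpy_model._fixes_] = sample
        # Private GPy call that propagates the new values through the model.
        self.gpy_model._trigger_params_changed()
```

GPy exposes no public equivalent of the `_fixes_` mask, hence the suggestion above to open an issue and possibly contribute interface changes to GPy.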
Integrating hyper-parameters didn't work with multi-output models or acquisitions without gradients.
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.