# Fixed broadcasting rules for `gpflow.models.model.predict_y`, partially resolves #1461 (#1597)

base: develop
```diff
@@ -18,10 +18,10 @@
 from .. import logdensities
 from ..base import Parameter
 from ..utilities import positive
 from ..utilities.ops import eye
 from .base import ScalarLikelihood
 from .utils import inv_probit


 class Gaussian(ScalarLikelihood):
     r"""
     The Gaussian likelihood is appropriate where uncertainties associated with
```

```diff
@@ -61,7 +61,11 @@ def _conditional_variance(self, F):
         return tf.fill(tf.shape(F), tf.squeeze(self.variance))

     def _predict_mean_and_var(self, Fmu, Fvar):
-        return tf.identity(Fmu), Fvar + self.variance
+        rank = tf.rank(Fvar).numpy()
+        if rank == 2:
+            return tf.identity(Fmu), Fvar + self.variance
+        else:
+            return tf.identity(Fmu), Fvar + eye(Fvar.shape[-1], self.variance)

     def _predict_log_density(self, Fmu, Fvar, Y):
         return tf.reduce_sum(logdensities.gaussian(Y, Fmu, Fvar + self.variance), axis=-1)
```

> **Review comment:** Can we, please, have an
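The intent of the rank check above can be sketched in plain NumPy. The helper name `add_observation_noise` is hypothetical (GPflow's actual code works on TensorFlow tensors and uses its `eye` utility), but the branching logic mirrors the diff: for marginal variances the noise is added elementwise, while for a full covariance it belongs only on the diagonal.

```python
import numpy as np

def add_observation_noise(f_var, noise_variance):
    """Add Gaussian-likelihood noise to a predictive variance.

    For a marginal-variance array of shape [N, P] the noise is added
    elementwise; for a full covariance of shape [..., N, N] it is added
    only on the diagonal, mirroring the rank check in the diff above.
    """
    if f_var.ndim == 2:
        # Marginal variances: plain broadcast addition.
        return f_var + noise_variance
    # Full covariance: noise goes on the diagonal of the last two axes.
    n = f_var.shape[-1]
    return f_var + noise_variance * np.eye(n)

marginal = np.ones((5, 1))       # [N, P] marginal variances
full = np.eye(4)[None, ...]      # [1, N, N] full covariance

print(add_observation_noise(marginal, 0.1).shape)  # (5, 1)
print(add_observation_noise(full, 0.1).shape)      # (1, 4, 4)
```

Off-diagonal entries of the full covariance are left untouched, which is exactly why a plain `Fvar + self.variance` is wrong for rank ≥ 3.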
```diff
@@ -211,14 +211,14 @@ def predict_y(
         """
         Compute the mean and variance of the held-out data at the input points.
         """
-        if full_cov or full_output_cov:
-            # See https://github.com/GPflow/GPflow/issues/1461
-            raise NotImplementedError(
-                "The predict_y method currently supports only the argument values full_cov=False and full_output_cov=False"
-            )
-
         f_mean, f_var = self.predict_f(Xnew, full_cov=full_cov, full_output_cov=full_output_cov)
-        return self.likelihood.predict_mean_and_var(f_mean, f_var)
+
+        if full_cov and full_output_cov:
+            f_var_mat = tf.reshape(f_var, [1, f_var.shape[0]*f_var.shape[1], f_var.shape[2]*f_var.shape[3]])
+            f_mean_pred, f_var_pred = self.likelihood.predict_mean_and_var(f_mean, f_var_mat)
+            return f_mean_pred, tf.reshape(f_var_pred, f_var.shape)
+        else:
+            return self.likelihood.predict_mean_and_var(f_mean, f_var)

     def predict_log_density(
         self, data: RegressionData, full_cov: bool = False, full_output_cov: bool = False
```

> **Review comment** (on the `tf.reshape` line): Would be nice if you could add expected shapes of tensors at the end of the line, for example

> **Review comment** (on the `tf.reshape` line): You probably want to use

> **Review comment** (on the `return f_mean_pred, ...` line): Shape comments would be helpful here as well.
> **Review comment:** I know this is not related to your PR, but I think we can drop the `tf.identity`s here, as this is a no-op and looks like legacy code.
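The reshape trick in `predict_y` above can be sketched in plain NumPy. The standalone function `predict_y_full_cov` and its noise argument are assumptions for illustration (the real method delegates the noise step to `self.likelihood.predict_mean_and_var`), but the shape gymnastics are the same: flatten the `[N, P, N, P]` joint covariance to a single `[1, N*P, N*P]` matrix so the Gaussian likelihood's diagonal noise broadcasts correctly, then reshape back.

```python
import numpy as np

def predict_y_full_cov(f_mean, f_var, noise_variance):
    """Sketch of the PR's reshape trick for full_cov=True, full_output_cov=True.

    f_var has shape [N, P, N, P]. It is flattened to [1, N*P, N*P],
    observation noise is added on the diagonal (standing in for the
    Gaussian likelihood), and the result is reshaped back.
    """
    n, p = f_var.shape[0], f_var.shape[1]
    f_var_mat = f_var.reshape(1, n * p, n * p)
    # Likelihood step: add observation noise on the diagonal.
    f_var_mat = f_var_mat + noise_variance * np.eye(n * p)
    return f_mean, f_var_mat.reshape(f_var.shape)

n_points, n_outputs = 3, 2
f_mean = np.zeros((n_points, n_outputs))
# Identity joint covariance over all N*P latent values, as a 4-D tensor.
f_var = np.eye(n_points * n_outputs).reshape(n_points, n_outputs, n_points, n_outputs)
_, f_var_y = predict_y_full_cov(f_mean, f_var, 0.1)
print(f_var_y.shape)  # (3, 2, 3, 2)
```

Note that the diff uses static shapes (`f_var.shape[0]`, etc.), which is what one reviewer's truncated comment appears to be probing: with unknown batch dimensions, the dynamic `tf.shape(f_var)` would be needed instead.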