[Feature Request] Allow kwargs to be passed to ExactMarginalLogLikelihood.forward() #2516
🚀 Feature Request
I was wondering if it'd be possible to allow keyword arguments (`**kwargs`) to be passed on to GPyTorch's `ExactMarginalLogLikelihood.forward()` and its internal call to the model's likelihood, or if that'd cause any issues. The superclass method `MarginalLogLikelihood.forward` already accepts `**kwargs`.

Motivation

Is your feature request related to a problem? Please describe.
Sometimes noise models or likelihood classes need additional keyword arguments to function. For example, in my case, I am trying to use `FixedNoiseGaussianLikelihood`, which uses a `FixedGaussianNoise` model for its noise covariance matrix. To call that model on data outside the training data, one needs to pass a `noise` keyword argument, which is only picked up after the other `*params`. At the moment `ExactMarginalLogLikelihood` only accepts `*params` and will reject any extra keyword argument.

I am using GPyTorch for a Bayesian calibration problem following Kennedy and O'Hagan (2001), which involves partially known inputs. In this problem, we have real and simulated data, and the real data's inputs can be seen as simulation inputs with a missing value (i.e., the unknown simulator parameter that best approximates reality). To estimate hyper-parameters in that paper, they follow a 2-stage approach:
For part 2, one needs to simultaneously estimate the unknown input, so the training inputs are not constant throughout training. I'm calling `ExactMarginalLogLikelihood` on the predictive distribution of the model conditioned on the simulation data. The simulation-related hyper-parameters, estimated in the first stage, are kept fixed (with their gradients disabled). Real and simulated data have different levels of noise, which are known in my case. So I'd need to pass the `noise` value through the MLL call to the likelihood model, but at the moment that is not possible: it throws an error complaining about the unknown argument if I try it.

References
Pitch
Describe the solution you'd like
The solution should be simple: just add `**kwargs` to the call signature of `ExactMarginalLogLikelihood.forward()` and then pass them on to its internal call to `self.likelihood`.

Describe alternatives you've considered
I could call the likelihood function directly on the predictive posterior, but that does not calculate the prior probabilities of the GP hyper-parameters. There is apparently no method available to compute the hyper-parameters' log prior probability, other than iterating through `.named_priors()`.
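To make that workaround concrete, here is a self-contained sketch of summing the hyper-parameter log priors by hand. It mocks GPyTorch's interface with toy classes (`NormalPrior`, `ToyModule`, and `hyperparam_log_prior` are all hypothetical stand-ins), assuming `named_priors()` yields `(name, module, prior, closure, setting_closure)` tuples with a closure that takes the module, as in recent GPyTorch versions:

```python
import math

class NormalPrior:
    """Toy stand-in for a GPyTorch prior exposing log_prob()."""
    def __init__(self, mean, std):
        self.mean, self.std = mean, std

    def log_prob(self, value):
        var = self.std ** 2
        return -0.5 * math.log(2 * math.pi * var) - (value - self.mean) ** 2 / (2 * var)

class ToyModule:
    """Stand-in exposing a named_priors() iterator like gpytorch.Module."""
    def __init__(self):
        self.lengthscale = 0.7
        self._priors = [
            # (name, module, prior, closure, setting_closure)
            ("lengthscale_prior", self, NormalPrior(0.0, 1.0),
             lambda m: m.lengthscale, None),
        ]

    def named_priors(self):
        yield from self._priors

def hyperparam_log_prior(module):
    # Accumulate the log prior probability over all registered priors.
    total = 0.0
    for _name, mod, prior, closure, _setting in module.named_priors():
        total += prior.log_prob(closure(mod))
    return total

print(hyperparam_log_prior(ToyModule()))  # ≈ -1.1639, i.e. log N(0.7 | 0, 1)
```

With real GPyTorch modules the same loop works, but you get the answer only by manual iteration, which is the inconvenience described above.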
Another option is to call `.set_train_data` with the full data for a given realisation of the unknown input and then call the MLL with the model in training mode. However, that would require resetting the noise model and would perhaps discard internal cached variables holding the simulation-data covariance matrix, which should stay constant throughout the second stage of training.

Are you willing to open a pull request? (We LOVE contributions!!!)
Yes.
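For concreteness, a minimal, self-contained sketch of the proposed change. The classes below are stubs, not actual GPyTorch source: `FixedNoiseLikelihoodStub` plays the role of `FixedNoiseGaussianLikelihood` (which needs a `noise` kwarg out of sample), and `ExactMLLSketch.forward` shows `**kwargs` being accepted and forwarded to the likelihood call, mirroring the signature `MarginalLogLikelihood.forward` already has:

```python
class FixedNoiseLikelihoodStub:
    """Toy stand-in for FixedNoiseGaussianLikelihood: calls on data outside
    the training set need an explicit `noise` keyword argument."""
    def __call__(self, function_dist, *params, noise=None):
        if noise is None:
            raise TypeError("`noise` must be supplied for out-of-sample data")
        return (function_dist, noise)

class ExactMLLSketch:
    """Toy stand-in for ExactMarginalLogLikelihood."""
    def __init__(self, likelihood):
        self.likelihood = likelihood

    # Before: def forward(self, function_dist, target, *params)
    # Proposed: also accept **kwargs and pass them through to the likelihood.
    def forward(self, function_dist, target, *params, **kwargs):
        output = self.likelihood(function_dist, *params, **kwargs)
        # ...log_prob(target), prior terms, and normalisation omitted...
        return output

mll = ExactMLLSketch(FixedNoiseLikelihoodStub())
print(mll.forward("f_dist", "y_train", noise=[0.1, 0.2]))
# -> ('f_dist', [0.1, 0.2])
```

With the current `*params`-only signature, the `noise=...` call above would instead raise a `TypeError` for the unexpected keyword argument, which is exactly the error described in the Motivation section.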