We have a fully functional type II maximum likelihood (ML-II) framework for Gaussian process regression. It should be able to perform gradient descent on the marginal likelihood (marginalised over the latent variables) in order to obtain a point estimate of the model's parameters, such as the kernel width and the noise parameter of the likelihood, that is resistant to overfitting.
This task is to write an example that does exactly this (along with a graphical counterpart that produces plots) to verify that it works correctly and to show people how to use GPR in shogun. See the sketch below for the computation such an example would exercise.
The example can be based on the current Python regression example.
As always, all parts used should be unit-tested if that has not happened yet.
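To make the intent concrete, here is a minimal self-contained NumPy sketch of the underlying computation: ML-II via plain gradient descent on the negative log marginal likelihood of an exact GP, jointly optimising the kernel width and the Gaussian likelihood's noise parameter. This is not the shogun API; the toy data, learning rate, and kernel parametrisation (k(x, x') = exp(-(x - x')^2 / width), matching shogun's GaussianKernel convention) are illustrative assumptions. The actual example should of course go through shogun's GPR and model-selection classes.

```python
import numpy as np

# Hypothetical 1-D toy data: a noisy sine, mirroring the Python regression example.
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 6, 40))           # training inputs, shape (n,)
y = np.sin(X) + 0.2 * rng.randn(X.size)      # noisy targets

def nlml_and_grad(log_params, X, y):
    """Negative log marginal likelihood of an exact GP and its gradient.

    log_params = [log(width), log(sigma_n)], with kernel
    k(x, x') = exp(-(x - x')^2 / width) and noise std sigma_n
    (illustrative parametrisation, assumed here, not shogun's API).
    """
    width, sigma_n = np.exp(log_params)
    n = X.size
    D = (X[:, None] - X[None, :]) ** 2        # squared pairwise distances
    Kf = np.exp(-D / width)                   # kernel matrix
    K = Kf + sigma_n ** 2 * np.eye(n)         # plus likelihood noise
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y
    # -log p(y|X) = 0.5 y^T K^{-1} y + 0.5 log|K| + (n/2) log(2 pi)
    nlml = 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * n * np.log(2 * np.pi)

    Kinv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(n)))
    W = np.outer(alpha, alpha) - Kinv         # d(log p)/dK = 0.5 * W
    # Chain rule through the log-space parameters:
    dK_dlogw = Kf * (D / width)               # width   * dK/d(width)
    dK_dlogs = 2 * sigma_n ** 2 * np.eye(n)   # sigma_n * dK/d(sigma_n)
    grad = -0.5 * np.array([np.sum(W * dK_dlogw), np.sum(W * dK_dlogs)])
    return nlml, grad

# Plain gradient descent on the negative log marginal likelihood.
theta = np.log([1.0, 1.0])                    # initial [width, sigma_n]
for it in range(200):
    nlml, g = nlml_and_grad(theta, X, y)
    theta -= 0.05 * g
width, sigma_n = np.exp(theta)
print("ML-II estimates: width %.3f, noise %.3f (nlml %.3f)" % (width, sigma_n, nlml))
```

The graphical counterpart would then train the GP with the selected parameters, plot the predictive mean and variance against the data, and compare them with the fit obtained from unoptimised initial parameters.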