Currently, approxposterior selects GP hyperparameters by optimizing the marginal log-likelihood. This can potentially lead to overfitting, so I should implement the ability for users to use K-fold cross-validation to optimize GP hyperparameters. This can get tricky as the dimensionality grows, however, so care should be taken when determining which hyperparameters to try during the cross-validation.
Implemented on the dev branch. The user can now set a new parameter, gpCV, to an integer to perform gpCV-fold cross-validation to select the best GP hyperparameters. In this case, we pick the GP hyperparameters, from the list of maximum likelihood solutions, that produce the lowest mean squared error during cross-validation.
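For anyone curious about the selection logic, here is a minimal sketch of the idea (not the actual approxposterior code): several candidate hyperparameter sets, e.g. from separate maximum-likelihood fits, are scored by K-fold cross-validation MSE, and the candidate with the lowest mean MSE wins. The scikit-learn GP, the RBF kernel, and the specific length-scale candidates are all assumptions here for illustration.

```python
# Sketch: pick GP hyperparameters by gpCV-fold cross-validation MSE.
# Uses scikit-learn as a stand-in; approxposterior's internals differ.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

# Hypothetical candidate hyperparameters (RBF length scales), standing in
# for the list of maximum-likelihood solutions mentioned above.
candidates = [0.5, 1.0, 2.0]

gpCV = 5  # number of folds, mirroring the gpCV parameter
best_mse, best_ell = np.inf, None
for ell in candidates:
    fold_mse = []
    kf = KFold(n_splits=gpCV, shuffle=True, random_state=0)
    for train_idx, test_idx in kf.split(X):
        # optimizer=None: evaluate the candidate as-is, no refitting
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=ell),
                                      optimizer=None)
        gp.fit(X[train_idx], y[train_idx])
        pred = gp.predict(X[test_idx])
        fold_mse.append(np.mean((pred - y[test_idx]) ** 2))
    mse = np.mean(fold_mse)
    if mse < best_mse:
        best_mse, best_ell = mse, ell

print(best_ell, best_mse)
```

The key point is that the candidates are held fixed during scoring (no re-optimization inside each fold), so the CV MSE compares the maximum-likelihood solutions directly.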