[DIA] Bayesian optimisation #1155
Conversation
* Adding script bayesianOptimisation.dml, which will contain the concrete implementation.
* Adding BuiltinBayesianOptimisationTest.java, used for tests.
* Adding bayesianOptimisationLM, bootstrapping tests for the implementation.
* Adding the script bayesianOptimisation.dml to the builtin functions, so that the parser understands the call to bayesianOptimisation().
* … files into the staging directory.
* Moving the new implementation to the correct directory.
* Removing the unneeded bayesianOptimisationLM.dml file, which would call the bayesianOptimisation.dml script.
* Working on implementing a gaussian_kernel.
* Calling acquisition/kernel functions.
* Creating intermediate test file bayesianOptimisationSinTest.dml.
* Moving acquisition/kernel functions to the intermediate test file.
kernel_gaus = function(Matrix[Double] X1, Matrix[Double] X2, Double variance)
  return (Matrix[Double] result)
{
  #TODO: What to do about multiple dimensional matrix?
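For illustration, a Gaussian (RBF) kernel corresponding to the `kernel_gaus` signature above can be sketched in Python/NumPy. The `variance` parameter name follows the DML snippet; treating it as the squared length-scale in the exponent is an assumption, and `kernel_matrix` is a hypothetical helper showing how the multi-row case raised in the TODO could be handled via broadcasting:

```python
import numpy as np

def kernel_gauss(x1, x2, variance):
    """Gaussian (RBF) kernel between two points.
    `variance` acts as the squared length-scale (assumption; the PR's
    DML snippet does not fix this interpretation)."""
    return np.exp(-np.sum((x1 - x2) ** 2) / (2.0 * variance))

def kernel_matrix(X1, X2, variance):
    """Hypothetical pairwise version: entry (i, j) = k(X1[i], X2[j]).
    Squared Euclidean distances are computed via broadcasting."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * variance))
```

A point compared with itself gives a kernel value of 1, and the pairwise matrix has shape (rows of X1, rows of X2).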
Hi,
Just some quick feedback: in DML we use two-space indentation.
Hi, and thanks for the feedback; I applied it in the last commit.
Currently the tests are failing because of missing licenses; these can simply be added at the top of the files.
* The algorithm is now initialized with random samples.
* With each iteration, the algorithm tries to pick better hyperparameter sets.
* There is still a bug which prevents accurate calculation of the means and variances used in the acquisition function.
* Improved documentation.
* Added further print messages for verbose mode.
* Using the upper confidence bound instead of the probability of improvement as the acquisition function.
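The switch to an upper-confidence-bound acquisition function can be sketched as follows. This is a generic UCB illustration, not the DML builtin's actual code; the `kappa` exploration weight and the sign convention for minimisation are assumptions:

```python
import numpy as np

def ucb(mean, std, kappa=2.0, minimize=True):
    """Upper-confidence-bound acquisition: trade the predicted mean off
    against uncertainty. `kappa` (exploration weight) is a hypothetical
    parameter, not one taken from the PR. For minimisation, lower scores
    are better, so the uncertainty bonus is subtracted."""
    if minimize:
        return mean - kappa * std
    return mean + kappa * std

# Pick the next hyperparameter set: the candidate with the best score.
means = np.array([0.5, 0.2, 0.9])
stds  = np.array([0.1, 0.3, 0.05])
next_idx = int(np.argmin(ucb(means, stds)))  # minimisation: argmin
```

Candidate 1 wins here: its mean is already lowest and its high uncertainty makes it even more attractive to explore.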
@Baunsgaard Thanks, I've added them now.
… means/variances. Rewrote the updating of the model.
* The comparison test with an lm function using the default parameters accidentally used the TEST and not the TRAINING dataset, which made the lm with the searched parameters always lose.
* Fixing license headers.
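The fixed evaluation protocol is the standard one: both models must be fit on the same training split and scored on the same held-out test split. A minimal sketch with an ordinary least-squares fit standing in for lm (the data and split here are illustrative, not from the PR's test):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error, used here as the comparison metric."""
    return float(np.mean((y_true - y_pred) ** 2))

# Synthetic linear data (illustrative stand-in for the PR's test data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# Fit on the TRAINING split only; evaluate on the held-out TEST split.
X_tr, X_te, y_tr, y_te = X[:80], X[80:], y[:80], y[80:]
w = np.linalg.lstsq(X_tr, y_tr, rcond=None)[0]
test_error = mse(y_te, X_te @ w)
```

The bug described above amounted to fitting one of the models against the test split, which makes any comparison between the two meaningless.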
@@ -1,3 +1,22 @@
/*
Please reformat the test file. Add some test cases for different parameters of the optimizer such as number of iterations, minimize, and ExecType (CP or SPARK).
* Kernel functions are now element-wise instead of vector-wise. The previous implementation left some complexity for the user to handle when choosing/designing the kernel function.
* Fixing an offset error of the idxScoreEntry.
* The acquisition function was receiving the variance instead of the standard deviation; fixed.
* Removed some debugging prints.
* Logs printed in debug mode now show up to 10 decimals.
* Added a test for maximisation.
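The variance/standard-deviation fix matters because UCB-style acquisition scores are expressed in the units of the objective, so the uncertainty term must be sqrt(variance). A hedged sketch of a Gaussian-process posterior that returns the standard deviation (generic GP math, not the PR's DML code; the unit-length-scale RBF kernel is an assumption):

```python
import numpy as np

def rbf(A, B):
    """RBF kernel with unit length-scale (illustrative choice)."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=2)
    return np.exp(-0.5 * d2)

def gp_posterior(X_train, y_train, X_query, kernel=rbf, noise=1e-8):
    """Posterior mean and *standard deviation* of a GP at X_query.
    Returning sqrt(variance) is the point of the fix above: the
    acquisition function expects the standard deviation."""
    K  = kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = kernel(X_train, X_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v * v, axis=0)    # prior diagonal is 1 for this rbf
    std = np.sqrt(np.maximum(var, 0.0))  # std dev, NOT variance
    return mean, std
```

At the training points the posterior mean interpolates the observations and the standard deviation collapses toward zero; far from the data it reverts to the prior (mean 0, std 1).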
LGTM.
Thanks, I didn't see your reply in time.
This pull request implements Bayesian optimisation for hyperparameters.
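Putting the pieces together, the overall loop can be sketched on a 1-D sin objective (the PR's intermediate test also uses a sin function). Everything here is an illustrative Python sketch, not the DML builtin's actual signature: random initial samples, a GP surrogate with an RBF kernel, and a confidence-bound acquisition to pick each next candidate:

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Toy 1-D objective to minimise (illustrative)."""
    return np.sin(3 * x) + 0.1 * x ** 2

def rbf(a, b):
    """Unit length-scale RBF kernel for 1-D inputs (assumption)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

candidates = np.linspace(-3, 3, 200)   # discrete candidate grid
X = rng.uniform(-3, 3, size=4)         # random initial samples
y = objective(X)

for _ in range(10):
    # GP posterior over the candidate grid.
    K   = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks  = rbf(X, candidates)
    sol = np.linalg.solve(K, Ks)
    mean = sol.T @ y
    var  = np.clip(1.0 - np.sum(Ks * sol, axis=0), 0.0, None)
    # Confidence-bound acquisition (minimisation: subtract the bonus).
    score = mean - 2.0 * np.sqrt(var)
    x_next = candidates[np.argmin(score)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

best = X[np.argmin(y)]                 # best hyperparameter found
```

Each iteration refits the surrogate on all evaluations so far and spends the next expensive objective call where the model is either promising or uncertain, which is the core idea the builtin implements for hyperparameter sets.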