Ability to tune the underlying bayes_opt params #189
Comments
Indeed, the current interface of the optimizer is not very flexible in this regard. nevergrad/nevergrad/optimization/optimizerlib.py Lines 1044 to 1067 in d5bda38
In this class, we could add close to anything (though the more we add, the more likely there will be new issues, so I try to be cautious about it). Is there anything in particular you would like to have access to?
I would like to have access to the utility function args (kind, kappa and xi), and for the GP, I think the alpha parameter will suffice.
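For context, a rough sketch of what these knobs control: `kind`, `kappa` and `xi` select and tune the acquisition (utility) function in `bayes_opt`. The parameter names below match `bayes_opt.UtilityFunction`, but the implementation is a simplified plain-Python stand-in for illustration, not the library's actual code:

```python
import math

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def utility(kind, mu, sigma, best, kappa=2.576, xi=0.0):
    """Simplified acquisition functions.

    kappa weighs exploration vs. exploitation for UCB;
    xi plays a similar role for EI and POI.
    mu/sigma are the GP's predictive mean/std at a candidate point,
    best is the best objective value observed so far.
    """
    if kind == "ucb":  # upper confidence bound
        return mu + kappa * sigma
    if kind == "ei":  # expected improvement
        z = (mu - best - xi) / sigma
        return (mu - best - xi) * normal_cdf(z) + sigma * normal_pdf(z)
    if kind == "poi":  # probability of improvement
        return normal_cdf((mu - best - xi) / sigma)
    raise ValueError(kind)

# Larger kappa favors points with high predictive uncertainty:
print(utility("ucb", mu=0.0, sigma=1.0, best=0.0, kappa=1.0))  # 1.0
print(utility("ucb", mu=0.0, sigma=1.0, best=0.0, kappa=5.0))  # 5.0
```

The GP's `alpha`, in turn, is the noise term added to the kernel diagonal in scikit-learn's `GaussianProcessRegressor`, which `bayes_opt` uses underneath.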
Another (maybe related) issue is the seed of the BayesianOptimization object: I noticed that it is random, but I think it should be reproducible/controllable.
The results of BO can be reproduced inside Nevergrad by setting numpy's random seed, as such:
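The snippet that followed did not survive; here is a minimal sketch of the principle, assuming (as discussed in this thread) that the optimizer's internal randomness flows through numpy's global RNG, so seeding before instantiating/running the optimizer makes the run deterministic. `run_bo_like_procedure` is a stand-in, not nevergrad's API:

```python
import numpy as np

def run_bo_like_procedure():
    # Stand-in for an optimizer whose internals draw from np.random,
    # as nevergrad's BO wrapper did at the time of this thread.
    return np.random.uniform(size=3)

np.random.seed(12)        # seed BEFORE creating/running the optimizer
first = run_bo_like_procedure()

np.random.seed(12)        # same seed -> same draws -> same BO results
second = run_bo_like_procedure()

assert np.allclose(first, second)
```

Note that this pins the *global* numpy RNG, which is exactly the cross-system reproducibility caveat raised below.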
Check out the previously closed issue #4. However, there are some reported issues with the inability to reproduce BO results across different systems; you might want to take a look at this: …
@lazyoracle I think you are missing this: …
@robert1826 Won't setting a fixed seed in …
@robert1826, as @lazyoracle mentioned, this line is precisely what makes seeding before initialization of the optimizer work. There are unit tests that make sure the results are reproducible (though I did notice slight differences from one computer to another, which seems consistent with the issue mentioned by @lazyoracle). For the future, I'm still considering changing how seeding works in …
Concerning the parameters, I'll try to add them next week. |
Above is the diff to provide access to more parameters. @robert1826, would that work for you? Can you give more details on what was useful for you and why? This is just for my own education; I am not yet very familiar with BO.
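The diff itself is not reproduced in this thread; schematically, the change amounts to exposing a handful of `bayes_opt` knobs on the nevergrad side, along the lines of the sketch below. Names and defaults here are illustrative, not the exact ones merged:

```python
class BOParameters:
    """Illustrative container for the bayes_opt knobs discussed above."""

    def __init__(self, utility_kind="ucb", kappa=2.576, xi=0.0, gp_alpha=1e-6):
        assert utility_kind in ("ucb", "ei", "poi")
        self.utility_kind = utility_kind  # acquisition function to use
        self.kappa = kappa                # exploration weight for UCB
        self.xi = xi                      # exploration offset for EI/POI
        self.gp_alpha = gp_alpha          # noise added to the GP kernel diagonal

params = BOParameters(utility_kind="ei", xi=0.01)
print(params.utility_kind, params.xi)  # ei 0.01
```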
This looks perfect to me, and it's exactly what I asked for. But for completeness, please add another parameter for the 'init' budget, instead of just setting it as sqrt(budget) here: nevergrad/nevergrad/optimization/optimizerlib.py Line 1004 in b8fdfa7
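The default being discussed, schematically: an optional override for the number of initial points, falling back to the square root of the total budget. Function and parameter names are illustrative, not nevergrad's actual signature:

```python
import math

def initialization_budget(budget, init_budget=None):
    """Number of initial (random/sampled) points before the GP model kicks in.

    Falls back to int(sqrt(budget)) when not specified, mirroring the
    hard-coded default referenced above.
    """
    if init_budget is not None:
        return init_budget
    return max(1, int(math.sqrt(budget)))

print(initialization_budget(100))      # 10
print(initialization_budget(100, 25))  # 25
```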
Ok, this is now changed as well. I am a bit worried that this will create a lot of border cases, but never mind for now. In the longer term though, this may mean that the interface will evolve. I see a couple of things that will probably change:
In a nutshell, don't be surprised if you see some changes (or don't hesitate to ask for more changes, or propose some); this is still under development (sorry for the bother...)
I'll merge now, but am unable to comprehensively test that all the parameters work as intended. Hopefully all should go well but let me know if you see anything weird ;) |
How well does this integrate with the family/factory of optimizers plan you were mentioning? Do you plan to have a …
My logic is that the proposed changes are just a way to 'expose' the params of bayes_opt to the user of nevergrad, so any border cases in this change are probably handled by the bayes_opt package. Anyway, I'm an active user and I'll try to test and fix bugs if I find some.
@robert1826 some of the initialization part is not strictly from …
Some problems require tuning the underlying bayes_opt params, such as the utility function being used, or even the underlying GP params; it seems that there is no way to change them using nevergrad.