Question about setting specified parameters. #176
Comments
Hey @yjhong89 - great question. We do support certain forms of constraints on the search space, but they generally do not allow the kind of flexibility I think you want here, if I understand you correctly. You're really asking for an ad hoc, non-programmatically-generated set of initial arms for your initial exploration round, on top of which you then run the bandit algorithm.

I've attached an example notebook that adapts the bandit tutorial to demonstrate how to manually select the initial arms for the initial exploration trial/round, instead of using a full-factorial design (which takes combinations of all the arms in the search space), and then run the bandit algorithm. See the ipynb and html of the notebook here:

Can you let me know if this addresses your question? Longer term, we're working on improving our APIs to better support ad hoc search spaces.
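The difference between a full-factorial design and a manually selected set of initial arms can be sketched in plain Python (this is not the Ax API; the parameter values are the hypothetical `x1`/`x2` domains from the original question):

```python
from itertools import product

# Hypothetical domains from the question: x1 in {1, 2, 3}, x2 in {1, 2}.
x1_values = [1, 2, 3]
x2_values = [1, 2]

# A full-factorial design enumerates every combination in the search space.
full_factorial = [{"x1": a, "x2": b} for a, b in product(x1_values, x2_values)]
print(len(full_factorial))  # 6 combinations

# Manually selecting the initial arms restricts exploration to chosen
# combinations, e.g. only the points with known true function values.
manual_arms = [
    {"x1": 1, "x2": 1},
    {"x1": 2, "x2": 1},
    {"x1": 3, "x2": 1},
    {"x1": 1, "x2": 2},
]
print(len(manual_arms))  # 4 combinations
```

In Ax itself, the attached notebook does the analogous thing by constructing the initial exploration trial from a hand-picked list of arms rather than the full cross-product.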
@kkashin - Thank you for your answer. I will try your solution and let you know. Thanks.

--Yet another question--

I was looking for a Bayesian optimization tutorial, and I tried it this way (following https://github.com/facebook/Ax/blob/master/tutorials/gpei_hartmann_service.ipynb):
By using the ax.log_trial_failure API, I can skip evaluations for those parameter combinations that don't have real function values. Thank you.
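The idea behind skipping unknown points can be sketched in plain Python: keep a lookup of known function values and mark every other suggested point as a failed trial instead of evaluating it. This mimics what marking a trial as failed accomplishes in the Ax Service API; the names and values below are illustrative, not the Ax API itself:

```python
# Hypothetical table of known true function values, keyed by (x1, x2).
known_values = {
    (1, 1): 0.7,
    (2, 1): 0.4,
    (1, 2): 0.9,
}

def evaluate_or_fail(x1, x2):
    """Return (value, ok); ok is False when no real value exists.

    In an Ax loop, the ok=False branch is where one would mark the
    trial as failed rather than report a fabricated observation.
    """
    key = (x1, x2)
    if key in known_values:
        return known_values[key], True
    return None, False

results = {p: evaluate_or_fail(*p) for p in [(1, 1), (3, 2)]}
```

Here `results[(1, 1)]` carries the known value, while `results[(3, 2)]` signals a failure to be logged instead of evaluated.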
@yjhong89 if the unknown areas of your search space can be represented as linear constraints (i.e., the valid area is a convex linear polytope), then you can use ParameterConstraints, which are accepted as arguments to the SearchSpace. If your invalid areas can't be represented this way (if they're "holes" in the search space, for instance), then your solution is about as good as any other option I could suggest. Creating your own RandomModel would be a reasonable but heavyweight option that would let you implement custom rejection sampling.
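The rejection-sampling idea mentioned above can be shown directly in plain Python, without the machinery of a custom RandomModel: draw candidate points and discard any that fall in the invalid "holes" of the search space. The invalid set below is hypothetical:

```python
import random

random.seed(0)

# Hypothetical "hole" in the search space: a combination with no real value.
invalid = {(3, 2)}

def sample_valid(n):
    """Draw n points from the x1/x2 grid, rejecting invalid combinations."""
    samples = []
    while len(samples) < n:
        candidate = (random.choice([1, 2, 3]), random.choice([1, 2]))
        if candidate not in invalid:  # reject candidates inside the hole
            samples.append(candidate)
    return samples

points = sample_valid(10)
```

A custom RandomModel would essentially wrap this loop so that the rest of the Ax pipeline only ever sees valid candidates.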
@yjhong89, you can also do Bayesian optimization on search spaces that consist of choice parameters. To find which solution is best, may I ask a bit more about your use case? How many total valid combinations are in your search space? Are all of your parameters choice parameters, and how many are there? What percentage of the total number of combinations do you know to be invalid?
@2timesjay @lena-kashtelyan Thank you. |
@yjhong89, hi again! I am still somewhat confused about your setup; please correct me if my understanding below is wrong:
If this is the setup, then what is the expected use of BayesOpt here? Re: TPE, we do not use it; the surrogate model is always a Gaussian process.
@lena-kashtelyan, thanks for your answer. You understand correctly; you are right. Thanks.
@yjhong89, would you say I fully answered your question or is there anything else you would like me to clarify? |
@lena-kashtelyan |
Hello.
I am trying to use Ax for a problem that seems well suited to bandit optimization.
Here, I have a question about how to set specified parameters (arms, in Ax) for function evaluation.
For example, I have two parameters named 'x1' and 'x2', with 'x1' in {1,2,3} and 'x2' in {1,2}.
I read this tutorial (https://ax.dev/docs/core.html), and I can set a search space through choice parameters or range parameters (int).
The problem is that I only have true evaluation function values for specific combinations of 'x1' and 'x2'.
e.g., I have f(x1=1, x2=1), but I don't know f(x1=3, x2=2).
Therefore, I want to specify the search space so that it contains only the parameter combinations for which I know the real function values (no need to search (x1=3, x2=2) in the example above).
How can I do that? Please help.
Thanks in advance.