
Feature request: specifying distributions, from which individual parameters in search space should be sampled #702

Closed
imbalu007 opened this issue Oct 5, 2021 · 5 comments
Assignees
Labels
enhancement (New feature or request), wishlist (Long-term wishlist feature requests)

Comments

@imbalu007

How to define the distribution from which a hyperparameter should be sampled?
In Hyperopt, this could be done using different functions in the API like, hp.normal, hp.lognormal, etc. (http://hyperopt.github.io/hyperopt/getting-started/search_spaces/)
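For illustration, the distribution primitives Hyperopt exposes have direct stdlib analogues; a minimal sketch (the parameter names here are made up, and the `hp.*` calls in the comments follow the linked Hyperopt docs):

```python
import random

# Stdlib analogues of Hyperopt's distribution primitives,
# hp.normal(label, mu, sigma) and hp.lognormal(label, mu, sigma):
momentum = random.normalvariate(0.9, 0.05)  # roughly hp.normal("momentum", 0.9, 0.05)
lr = random.lognormvariate(0.0, 1.0)        # roughly hp.lognormal("lr", 0, 1)

# hp.lognormal draws exp(N(mu, sigma)), so samples are always positive.
```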

@lena-kashtelyan
Contributor

lena-kashtelyan commented Oct 5, 2021

Hi, @imbalu007! I don't think we currently allow defining parameters as distributions in general. You can, however, mark your range parameters as log_scale=True (API ref: https://ax.dev/api/core.html#ax.core.parameter.RangeParameter), which will ensure that the parameter’s random values are sampled from log space.

cc @danielrjiang (as current ModOpt oncall) and @Balandat and @eytan for additional thoughts
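Conceptually, `log_scale=True` means the parameter is sampled uniformly in log space and then exponentiated, which spreads samples evenly across orders of magnitude. A minimal stdlib sketch of that behavior (not Ax's actual implementation):

```python
import math
import random

def sample_log_scale(lower, upper):
    """Sample uniformly in log space, then exponentiate.

    This gives each order of magnitude equal probability mass,
    which is the behavior log_scale=True requests for a
    RangeParameter (a sketch, not Ax's implementation).
    """
    return math.exp(random.uniform(math.log(lower), math.log(upper)))

# E.g., a learning rate drawn log-uniformly between 1e-5 and 1e-1:
lr = sample_log_scale(1e-5, 1e-1)
```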

@lena-kashtelyan lena-kashtelyan self-assigned this Oct 5, 2021
@dme65
Contributor

dme65 commented Oct 6, 2021

To add to Lena's answer, Hyperopt is a bit different since it mainly uses TPEs (Tree-structured Parzen Estimators), where it is more natural to specify priors for the individual tunable parameters. The most common model used by Ax is a Gaussian process (GP) with a Matern kernel (though other models can be used as well), and it isn't as natural to specify the same types of priors for GPs. You technically can, but your posterior likely won't be Gaussian anymore. You can still achieve log-scale sampling by setting log_scale=True on the search space parameter, as @lena-kashtelyan suggested.
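For reference, the Matern-5/2 covariance mentioned above has a simple closed form; a standalone sketch for a 1-D input (Ax/BoTorch use their own implementations, with learned lengthscales):

```python
import math

def matern52(x, y, lengthscale=1.0):
    # Matern-5/2 kernel: k(r) = (1 + sqrt(5)*r + 5*r^2/3) * exp(-sqrt(5)*r),
    # with r = |x - y| / lengthscale. Equals 1 at r = 0 and decays with distance.
    r = abs(x - y) / lengthscale
    s = math.sqrt(5.0) * r
    return (1.0 + s + (5.0 / 3.0) * r * r) * math.exp(-s)
```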

Out of curiosity, can you provide some more context on why you want to specify different priors for the tunable parameters?

@imbalu007
Author

@dme65, it is a requirement from one of our customers who use Bayesian optimization to fine-tune their models.
Regarding support for priors, my current understanding is that Ax, through its ModelBridge, supports other models (e.g., https://ax.dev/api/modelbridge.html#module-ax.modelbridge.random) that could take advantage of priors on search space parameters. So, for those models, is there any way that we can specify the priors?

@lena-kashtelyan
Contributor

As @dme65 helped me understand, specifying priors on individual parameters is very natural for Tree-structured Parzen Estimators (TPE), but not at all for Gaussian processes (GPs), at least not if we are using analytic mean and variance predictions as Ax does at the moment.

There are a few paths we could take to support user-specified priors for individual parameters:

  1. Use TPEs. Supporting them in Ax seems reasonable, but it's not on our roadmap at the moment.
  2. Encode priors on parameters in a prior for the optimum, but this is a pretty open research question since it isn't natural for GPs. Also not on our roadmap at the moment.
  3. There could be a path through adding a more complicated probabilistic model and using NUTS (the No-U-Turn Sampler) to do inference, but that is not something we are likely to work on.

In light of all of these, we'd say that if priors on individual parameters are crucial, it might be best to stick with Hyperopt or Optuna for now, as for us supporting this is a wishlist item with an uncertain status.

@lena-kashtelyan lena-kashtelyan changed the title Search space definition is limited Feature request: specifying distributions, from which individual parameters in search space should be sampled Oct 6, 2021
@lena-kashtelyan lena-kashtelyan added the enhancement (New feature or request) and wishlist (Long-term wishlist feature requests) labels Oct 6, 2021
@lena-kashtelyan
Contributor

Merging this issue into our wishlist master task.
