Hierarchical search spaces #140

Closed
yonatanMedan opened this issue Aug 7, 2019 · 12 comments

@yonatanMedan

Is there a way to define a hierarchy of parameters?
for example a parameter that chooses architecture, and each architecture has its own parameters.

example (pseudo code):

architecture = choice(["NeuralNetwork", "xgboost"])

if architecture == "NeuralNetwork":
    n_layers = choice(range(1, 10, 1))
    # more architecture-related params here.

elif architecture == "xgboost":
    max_depth = choice(range(1, 5, 1))
    # more architecture-related params here.

@ldworkin
Contributor

ldworkin commented Aug 7, 2019

Hi @yonatanMedan! Great question. We don't currently support this, but it's on our roadmap to support in the next few months. I'll let you know when it's ready!

ldworkin added the enhancement label on Aug 7, 2019
@riyadparvez

Yes, this would be a great addition! I have a similar use case: after hyperparameter optimization, choosing the right threshold for classification.

@Tandon-A

Tandon-A commented Jun 14, 2020

This enhancement would be super helpful in my use case, where I want to experiment with different learning rate schedulers that each take different parameters.

lena-kashtelyan changed the title from "[Question] way to define hierarchy of parameters?" to "Hierarchical search spaces" on Aug 28, 2020
@LyzhinIvan

Hi! Are there any estimates of when this functionality will be available?

@ldworkin
Contributor

Hi @LyzhinIvan! Unfortunately, probably not in the immediate short term. This has been deprioritized in favor of other efforts. However, it's certainly still on our roadmap! cc @2timesjay

@lena-kashtelyan
Contributor

We will now be tracking wishlist items / feature requests in a master issue for improved visibility: #566. Of course, please feel free to still open new feature request issues; we'll take care of thinking them through and adding them to the master issue.

@lena-kashtelyan
Contributor

lena-kashtelyan commented Oct 27, 2021

This is in progress now, and it's already possible to run experiments over hierarchical search spaces via the Developer and Service APIs (the functionality is currently on the main branch and will be included in the next stable release). There are some constraints: a hierarchical search space must be a valid connected tree with a single root parameter, and currently only Sobol is supported while we develop proper modeling support. But it is possible to run a basic optimization over a hierarchical search space like so:

from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.service.utils.report_utils import exp_to_df
from ax.utils.measurement.synthetic_functions import branin

ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "model",
            "type": "choice",
            "values": ["Linear", "XGBoost"],
            "dependents": {
                "Linear": ["learning_rate", "l2_reg_weight"],
                "XGBoost": ["num_boost_rounds"],
            },
        },
        {
            "name": "learning_rate",
            "type": "range",
            "bounds": [0.001, 0.1],
            "log_scale": True,
        },
        {
            "name": "l2_reg_weight",
            "type": "range",
            "bounds": [0.00001, 0.001],
        },
        {
            "name": "num_boost_rounds",
            "type": "range",
            "bounds": [0, 15],
        },
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
    # To force "Sobol" if BayesOpt does not work well (please post a repro into 
    # a GitHub issue to let us know; it would be a great help in debugging this faster!)
    # choose_generation_strategy_kwargs={"no_bayesian_optimization": True},
)


def contrived_branin(parameterization):  # branin domain: x1 in [-5., 10.], x2 in [0., 15.]
    if parameterization.get("model") == "Linear":
        lr = parameterization.get("learning_rate")
        l2_reg = parameterization.get("l2_reg_weight")
        
        print(f"Computing Branin with x1={lr * 100}, x2={l2_reg * 1000} (`Linear` model case)")
        return branin(lr * 100, l2_reg * 1000)
    
    if parameterization.get("model") == "XGBoost":
        num_boost_rounds = parameterization.get("num_boost_rounds")
        
        print(f"Computing Branin with x1={num_boost_rounds-5}, x2={num_boost_rounds} (`XGBoost` model case)")
        return branin(num_boost_rounds-5, num_boost_rounds)
    
    raise NotImplementedError

for _ in range(20):
    params, trial_index = ax_client.get_next_trial()
    
    ax_client.complete_trial(
        trial_index=trial_index, 
        raw_data=contrived_branin(params)
    )

exp_to_df(ax_client.experiment)

We don't have a specific estimate of when our BayesOpt algorithms will support this functionality, but it should be within a few months.

cc @yonatanMedan, @LyzhinIvan, @Tandon-A, @riyadparvez

@lena-kashtelyan
Contributor

Reopening this issue, as it is now in progress.

@LyzhinIvan

Great, thanks! I'm looking forward to BayesOpt mode being supported.

@lena-kashtelyan
Contributor

lena-kashtelyan commented Jan 10, 2022

@yonatanMedan, @LyzhinIvan, @Tandon-A, @riyadparvez: BayesOpt is now supported in alpha mode and currently works through search-space flattening (so the Gaussian Process model is not aware of the hierarchical structure of the search space under the hood). cc @dme65 to say more about when BayesOpt over flattened search spaces is effective.

If you try it, please let us know how it goes for you (ideally in this issue)! Here is an updated version of my example above that should let you run BayesOpt:

from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.service.utils.report_utils import exp_to_df
from ax.utils.measurement.synthetic_functions import branin

ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "model",
            "type": "choice",
            "values": ["Linear", "XGBoost"],
            "dependents": {
                "Linear": ["learning_rate", "l2_reg_weight"],
                "XGBoost": ["num_boost_rounds"],
            },
        },
        {
            "name": "learning_rate",
            "type": "range",
            "bounds": [0.001, 0.1],
            "log_scale": True,
        },
        {
            "name": "l2_reg_weight",
            "type": "range",
            "bounds": [0.00001, 0.001],
        },
        {
            "name": "num_boost_rounds",
            "type": "range",
            "bounds": [0, 15],
        },
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
    # To force "Sobol" if BayesOpt does not work well (please post a repro into 
    # a GitHub issue to let us know; it would be a great help in debugging this faster!)
    # choose_generation_strategy_kwargs={"no_bayesian_optimization": True},
)
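
The evaluation loop is the same as in the example above, reusing `contrived_branin` from that example:

# Reuses `contrived_branin` from the earlier example in this thread.
for _ in range(20):
    params, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(
        trial_index=trial_index,
        raw_data=contrived_branin(params),
    )

exp_to_df(ax_client.experiment)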

@lena-kashtelyan
Contributor

I'll close this issue as inactive, but the example above is functional for anyone who wants to try it!

@sgbaird
Contributor

sgbaird commented Jul 1, 2022

Another use case for hierarchical search spaces is multi-step processing / synthesis in the physical sciences. For example, there are four pieces of equipment, A, B, C, and D, where A always comes first and D always comes last, and in the middle you can choose either B or C, where B and C have distinct equipment parameters. Another case is where you can omit the second processing step altogether.

B or C might be a surface preparation step involving two different types of surface preparation: plasma etching vs. chemical etching. I think that often (at least in academia), multi-step/multi-path synthesis routes are reduced to single-path optimizations that operate largely independently from one another despite sharing common traits. I think featurization of complex synthesis routes is still an open question. The examples that do treat complex synthesis routes generally fall into the category of natural language processing (e.g. ULSA, Roborxn).
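
As a rough sketch using the `dependents` syntax from the examples above (equipment parameter names and bounds here are made up for illustration), the B-vs-C choice might look like the snippet below; since the search space has to be a connected tree with a single root, the always-active parameters of A and D would presumably either hang off every branch or be handled outside this hierarchy:

from ax.service.ax_client import AxClient, ObjectiveProperties

ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            # Root parameter: which piece of equipment runs the middle step.
            "name": "middle_step",
            "type": "choice",
            "values": ["B", "C"],
            "dependents": {
                # Hypothetical parameters for plasma etching (equipment B).
                "B": ["plasma_power", "plasma_duration"],
                # Hypothetical parameter for chemical etching (equipment C).
                "C": ["etchant_concentration"],
            },
        },
        {"name": "plasma_power", "type": "range", "bounds": [50.0, 300.0]},
        {"name": "plasma_duration", "type": "range", "bounds": [10.0, 120.0]},
        {"name": "etchant_concentration", "type": "range", "bounds": [0.01, 1.0]},
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
)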

How does the flattening of the search space work in @lena-kashtelyan's example above? One option would be to add additional boolean variables that describe whether a given branch is active, and/or to set the inactive parameters to particular values.
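
Purely as an illustration of that boolean-flag idea (not a claim about how Ax actually implements flattening), flattening the model / learning_rate / l2_reg_weight / num_boost_rounds example by hand might look something like this, with arbitrary fill values for inactive parameters:

# Illustration only -- NOT how Ax implements flattening internally.
# Fill values for inactive branch parameters are arbitrary choices.
FILL_VALUES = {"learning_rate": 0.01, "l2_reg_weight": 0.0001, "num_boost_rounds": 0}

def flatten(parameterization):
    flat = dict(FILL_VALUES)
    # Boolean flag encoding which branch of the hierarchy is active.
    flat["model_is_linear"] = parameterization["model"] == "Linear"
    # Copy over whichever branch parameters are actually present (active).
    for name in FILL_VALUES:
        if name in parameterization:
            flat[name] = parameterization[name]
    return flat

# e.g. flatten({"model": "XGBoost", "num_boost_rounds": 7})
# -> {"learning_rate": 0.01, "l2_reg_weight": 0.0001, "num_boost_rounds": 7, "model_is_linear": False}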

"so the Gaussian Process model is not aware of the hierarchical structure of the search space under the hood"

It sounds like adding variables that encode the hierarchy isn't the approach that's taken here; however, the inactive variables would still need to be assigned values, correct? How is this handled currently?

I wonder if there's some possibility of representing hierarchical search spaces as flattened search spaces with non-linear constraints. I'd need to give that one some more thought.

(not time-sensitive for me atm)
