
Integrate SMAC with benchmarking #195

Closed
pvk-developer opened this issue May 12, 2020 · 0 comments · Fixed by #200
pvk-developer commented May 12, 2020

Integrate SMAC: Sequential Model-based Algorithm Configuration with our benchmarking.

How to integrate SMAC3 tuners:

All SMAC3 tuners share the same API: a tuner is a class exposing a single optimize method that returns the best hyperparameter configuration found. The user must create a Scenario object with the following arguments:

  • cs: the ConfigSpace (a ConfigurationSpace describing the hyperparameters to be tuned).
  • run_obj: the objective to optimize, either quality or runtime (we are interested in quality).
  • runcount_limit: the number of iterations (function evaluations).
  • deterministic: if false, the model may re-evaluate the same configuration several times.
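As a sketch, the Scenario arguments above map onto a plain settings dictionary (key names follow the smac 0.12.x convention, where the limit is spelled "runcount-limit"; the cs value is a placeholder here, since in real code it would be a ConfigSpace.ConfigurationSpace instance passed to smac.scenario.scenario.Scenario):

```python
# Hedged sketch: the Scenario settings described above as a plain dict.
# In real code this dict is passed to smac.scenario.scenario.Scenario(...)
# and "cs" holds a ConfigSpace.ConfigurationSpace instance.
scenario_kwargs = {
    "cs": None,               # placeholder for the ConfigurationSpace to tune
    "run_obj": "quality",     # optimize solution quality, not runtime
    "runcount-limit": 100,    # number of iterations (function evaluations)
    "deterministic": "true",  # "false" lets SMAC re-evaluate the same point
}

print(sorted(scenario_kwargs))
```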

Once this Scenario is created, we can instantiate the tuner, which receives the following arguments:

  • scenario: the Scenario created above.
  • tae_runner: the function we want to optimize.
  • rng: the random seed (or random number generator).
  • acquisition_function (optional): tuners based on a machine learning model can receive an acquisition function; EI or LogEI is used by default.

With the tuner instance configured, we call its optimize method to obtain the best hyperparameter configuration.
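To make the call pattern concrete without depending on smac itself, here is a minimal stand-in tuner (StubTuner is a hypothetical name; it does random search over a made-up one-dimensional space) that mimics the interface described above: construction with a tae_runner and an rng, then a single optimize call that returns the best configuration found:

```python
import random


class StubTuner:
    """Minimal stand-in for a SMAC facade such as SMAC4HPO: it samples
    random configurations and keeps the one with the lowest cost. The
    real facades are constructed with scenario=..., tae_runner=...,
    rng=... and expose the same single optimize() entry point."""

    def __init__(self, tae_runner, rng, n_trials=200):
        self.tae_runner = tae_runner
        self.rng = random.Random(rng)
        self.n_trials = n_trials

    def optimize(self):
        best, best_cost = None, float("inf")
        for _ in range(self.n_trials):
            config = {"x": self.rng.uniform(-5, 5)}  # hypothetical 1-D space
            cost = self.tae_runner(config)           # SMAC minimizes cost
            if cost < best_cost:
                best, best_cost = config, cost
        return best


# Toy objective with its minimum at x = 1.
tuner = StubTuner(tae_runner=lambda c: (c["x"] - 1) ** 2, rng=42)
best = tuner.optimize()
print(best)
```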

Because the tuner API is different from what our benchmark expects, we will need to create a wrapper function that:

  • Adjusts the evaluation function so that it accepts a dictionary as input (we accept kwargs) and turns the problem into minimization (SMAC only minimizes).
  • Creates the Scenario and then the optimizer instance.
  • Calls the optimize method.
  • When finished, computes the score obtained by the returned configuration.
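The first wrapper step, adapting a kwargs-based maximizing score function into a dict-based minimizing tae_runner, can be sketched as follows (make_minimization_tae and the toy score function are illustrative names, not part of our benchmark API):

```python
def make_minimization_tae(score_fn):
    """Wrap a maximization scoring function that takes kwargs into a
    minimization tae_runner that takes a single configuration mapping,
    as SMAC expects (SMAC Configurations are dict-convertible)."""
    def tae(config):
        params = dict(config)       # configuration mapping -> plain dict
        return -score_fn(**params)  # negate: SMAC minimizes, we maximize
    return tae


# Toy scoring function, maximized at a=2, b=3 (score 0 at the optimum).
def score(a, b):
    return -((a - 2) ** 2 + (b - 3) ** 2)


tae = make_minimization_tae(score)
print(tae({"a": 2, "b": 3}))  # -> 0, the minimum cost
print(tae({"a": 0, "b": 0}))  # -> 13, a worse (higher-cost) point
```

The final wrapper step is simply re-applying score (not tae) to the configuration that optimize returns, so the benchmark sees the original maximization scale.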

SMAC3 tuners:

SMAC4HPO
Bayesian optimization using a Random Forest model from pyrfr. Its implementation is explained in the following paper. We want to implement this tuner because its main purpose is finding the optimal configuration for a machine learning algorithm, and we want to use it for all kinds of challenges in our library.

HB4AC (Hyperband)
Uses Successive Halving to generate proposals. Its implementation is explained in the following paper. We are interested in this hyperparameter optimization algorithm.

ROAR (Random Online Aggressive Racing)
Selects and tests uniformly random parameter settings. Its implementation is explained in the following paper. We don't want to implement it because this algorithm is equivalent to our UniformTuner.

SMAC4AC (Algorithm Configuration)
Selects parameter settings using a Random Forest from the pyrfr library. Its implementation is explained in the following paper. We don't want to implement it because it is designed to optimize the execution time of different algorithms, and we are more interested in machine learning model hyperparameter optimization.

SMAC4BO
Bayesian optimization using a Gaussian Process model from the skopt library. Its implementation is explained in the following paper. We are not interested in its integration because its implementation is aimed at tuning small values from 0 to 1.0.

BOHB4HPO (Bayesian Optimization (BO) and Hyperband (HB))
Starts with Successive Halving until enough configurations have been evaluated to train a model (a Gaussian Process), then switches to using it. The proposal for this model can be found in the following paper. We are not interested in its integration because it has the same limitation as SMAC4BO.

@pvk-developer pvk-developer added this to the 0.3.9 milestone May 12, 2020
@pvk-developer pvk-developer self-assigned this May 12, 2020
@pvk-developer pvk-developer removed this from the 0.3.9 milestone May 18, 2020
@pvk-developer pvk-developer added this to the 0.3.10 milestone May 29, 2020