Advisor

Introduction

Advisor is a hyperparameter tuning system for black-box optimization.

It is an open-source implementation of Google Vizier with the following features:

  • Get suggestions from the API, Web UI, or CLI
  • Support the abstractions of Study and Trial
  • Include search and early-stop algorithms
  • Recommend parameters with a trained model
  • Provide the same programming interfaces as Google Vizier

Algorithms

  • Random Search Algorithm
  • 2x Random Search Algorithm
  • Grid Search Algorithm
  • Bayesian Optimization
  • Gaussian Process Bandit
  • Batched Gaussian Process Bandits
  • SMAC Algorithm
  • CMA-ES Algorithm
  • No Early Stop Algorithm
  • Early Stop First Trial Algorithm
  • Early Stop Descending Algorithm
  • Performance Curve Stop Algorithm
  • Median Stop Algorithm
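
For intuition, here is a minimal sketch of what the plain random search algorithm does over a study's search space. This is illustrative only; the actual implementations live in the Advisor server, and the search space here is a hypothetical one.

import random

# Hypothetical search space: one integer and one double parameter.
search_space = {
    "hidden1": ("INTEGER", 40, 400),
    "learning_rate": ("DOUBLE", 0.01, 0.5),
}

def random_search_suggestion(space):
    """Draw one independent random suggestion from the search space."""
    suggestion = {}
    for name, (ptype, low, high) in space.items():
        if ptype == "INTEGER":
            suggestion[name] = random.randint(low, high)
        else:  # DOUBLE
            suggestion[name] = random.uniform(low, high)
    return suggestion

print(random_search_suggestion(search_space))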

Usage

Advisor Server

Install the dependencies and run the Advisor server from the project root.

pip install -r ./requirements.txt

./manage.py runserver 0.0.0.0:8000

Open http://127.0.0.1:8000 in the browser.

Docker Server

You can also run the server with Docker.

docker run -d -p 8000:8000 tobegit3hub/advisor

Advisor Client

Install with pip.

pip install advisor_client

Run with the Python SDK.

from advisor_client.client import AdvisorClient  # assuming this import path for the client class

client = AdvisorClient()

# Create the study
study_configuration = {
    "goal": "MAXIMIZE",
    "maxTrials": 5,
    "maxParallelTrials": 1,
    "params": [
        {
            "parameterName": "hidden1",
            "type": "INTEGER",
            "minValue": 40,
            "maxValue": 400,
            "scallingType": "LINEAR"
        }
    ]
}
study = client.create_study("Study", study_configuration)

# Get suggested trials
trials = client.get_suggestions(study, 3)

# Complete each trial with the metric measured by your training job
for trial in trials:
    trial_metrics = train_and_evaluate(trial)  # train_and_evaluate is your own code, not part of the SDK
    client.complete_trial(trial, trial_metrics)

Run with the command-line tool.

advisor study list

advisor trial list --study_id 1

Please check out the examples for more usage.

Concepts

The study configuration describes the search space of the parameters. It supports four parameter types; here is an example.

{
  "goal": "MAXIMIZE",
  "randomInitTrials": 1,
  "maxTrials": 5,
  "maxParallelTrials": 1,
  "params": [
    {
      "parameterName": "hidden1",
      "type": "INTEGER",
      "minValue": 1,
      "maxValue": 10,
      "scallingType": "LINEAR"
    },
    {
      "parameterName": "learning_rate",
      "type": "DOUBLE",
      "minValue": 0.01,
      "maxValue": 0.5,
      "scallingType": "LINEAR"
    },
    {
      "parameterName": "hidden2",
      "type": "DISCRETE",
      "minValue": 0,
      "maxValue": 0,
      "feasiblePoints": "1.5, -1.5, 2.5, 4.5",
      "scallingType": "LINEAR"
    },
    {
      "parameterName": "optimizer",
      "type": "CATEGORICAL",
      "minValue": 0,
      "maxValue": 0,
      "feasiblePoints": "sgd, adagrad, adam, ftrl",
      "scallingType": "LINEAR"
    },
    {
      "parameterName": "batch_normalization",
      "type": "CATEGORICAL",
      "minValue": 0,
      "maxValue": 0,
      "feasiblePoints": "true, false",
      "scallingType": "LINEAR"
    }
  ]
}
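
As a rough sketch of how such a configuration is interpreted, the snippet below draws one value for each parameter type: INTEGER and DOUBLE sample between minValue and maxValue, while DISCRETE and CATEGORICAL choose from feasiblePoints. This is illustrative only, not the server's actual sampling code, and uses a shortened copy of the parameters above.

import random

def sample_parameter(param):
    ptype = param["type"]
    if ptype == "INTEGER":
        return random.randint(param["minValue"], param["maxValue"])
    if ptype == "DOUBLE":
        return random.uniform(param["minValue"], param["maxValue"])
    # DISCRETE and CATEGORICAL both enumerate feasiblePoints
    return random.choice([p.strip() for p in param["feasiblePoints"].split(",")])

params = [
    {"parameterName": "hidden1", "type": "INTEGER", "minValue": 1, "maxValue": 10},
    {"parameterName": "learning_rate", "type": "DOUBLE", "minValue": 0.01, "maxValue": 0.5},
    {"parameterName": "optimizer", "type": "CATEGORICAL",
     "feasiblePoints": "sgd, adagrad, adam, ftrl"},
]

suggestion = {p["parameterName"]: sample_parameter(p) for p in params}
print(suggestion)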

Visualization

You can visualize one-dimensional Bayesian optimization with the notebooks in visualization.
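
For reference, here is a minimal one-dimensional Bayesian optimization sketch with a Gaussian process surrogate and an expected-improvement acquisition function. It is a standalone illustration assuming numpy, scipy, and scikit-learn, not the code in the notebooks, and the objective function is hypothetical.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Hypothetical black-box function to maximize.
    return -(x - 2.0) ** 2 + 1.0

bounds = (0.0, 4.0)
grid = np.linspace(*bounds, 200).reshape(-1, 1)

# Start from a few random observations.
rng = np.random.default_rng(0)
X = rng.uniform(*bounds, size=(3, 1))
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(5):
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.max()
    # Expected improvement acquisition function.
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (mu - best) / sigma
        ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
        ei[sigma == 0.0] = 0.0
    # Evaluate the objective at the most promising point and add it to the data.
    x_next = grid[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("Best x:", X[np.argmax(y)][0], "best y:", y.max())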

Screenshots

List all the studies and create or delete studies easily.

study_list.png

List the details of a study and all its related trials.

study_detail.png

List all the trials and create or delete trials easily.

trial_list.png

List the details of a trial and all its related metrics.

trial_detail.png
