HyperBO - Prior Discovery

A Jax/Flax codebase for HyperBO, the algorithm described in Pre-trained Gaussian processes for Bayesian optimization.

Research paper | Blog post | Colab Notebook | PD1 benchmark

Disclaimer: This is not an officially supported Google product.

Tutorial

Follow HyperBO's Colab Notebook or Jupyter Notebook.

Also see the tests for more comprehensive usage examples.

Installation

We recommend using Python 3.7 or 3.9 for stability.

To install the latest development version inside a virtual environment, run

python3 -m venv env-pd
source env-pd/bin/activate
pip install --upgrade pip
pip install "git+https://github.com/google-research/hyperbo.git#egg=hyperbo"

PD1 benchmark

PD1 is a new hyperparameter tuning benchmark for optimizing deep learning models. To download the PD1 dataset, copy and paste the following link into your browser's address bar.

http://storage.googleapis.com/gresearch/pint/pd1.tar.gz

See pd1/README.txt for more information. The data is licensed under the CC-BY 4.0 license.
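
If you prefer a programmatic download, here is a minimal sketch using only the Python standard library (the pd1/ extraction directory is inferred from the pd1/README.txt path above):

import tarfile
import urllib.request

# Fetch and unpack the PD1 archive into the current directory.
url = 'http://storage.googleapis.com/gresearch/pint/pd1.tar.gz'
urllib.request.urlretrieve(url, 'pd1.tar.gz')
with tarfile.open('pd1.tar.gz', 'r:gz') as archive:
    archive.extractall('.')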

If you'd like to use the evaluations at each training step, the relevant columns of the data frame are

'valid/ce_loss',
'train/ce_loss',
'train/error_rate',

etc. Each of these holds an array aligned with the global_step column, which indicates the training step at which each measurement was taken.

See the "best_*" columns for the best measurement achieved over training.

GPax

GPax is a modular implementation of Gaussian processes used by HyperBO, built on TensorFlow Probability with the Jax backend.
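
As a rough illustration of that underlying stack (plain TensorFlow Probability on Jax, not GPax's own API), a GP prior can be constructed and sampled like this:

import jax
import jax.numpy as jnp
from tensorflow_probability.substrates import jax as tfp

tfd = tfp.distributions
tfk = tfp.math.psd_kernels

# A squared-exponential GP prior over ten 1-d index points.
kernel = tfk.ExponentiatedQuadratic(amplitude=1.0, length_scale=0.5)
index_points = jnp.linspace(-1.0, 1.0, 10)[:, None]
gp = tfd.GaussianProcess(
    kernel=kernel,
    index_points=index_points,
    observation_noise_variance=1e-4,
)
sample = gp.sample(seed=jax.random.PRNGKey(0))  # one draw from the prior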

Citation

Please cite our work if you would like to use the code.

@article{wang2023hyperbo,
  title={{Pre-trained Gaussian processes for Bayesian optimization}},
  author={Zi Wang and
          George E. Dahl and
          Kevin Swersky and
          Chansoo Lee and
          Zachary Nado and
          Justin Gilmer and
          Jasper Snoek and
          Zoubin Ghahramani},
  journal={arXiv preprint arXiv:2109.08215},
  year={2023}
}