Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.
Install hyperopt from PyPI:

```bash
pip install hyperopt
```

Then run your first example:
```python
# define an objective function
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# define a search space
from hyperopt import hp
space = hp.choice('a',
    [
        ('case 1', 1 + hp.lognormal('c1', 0, 1)),
        ('case 2', hp.uniform('c2', -10, 10))
    ])

# minimize the objective over the space
from hyperopt import fmin, tpe, space_eval
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

print(best)
# -> {'a': 1, 'c2': 0.01420615366247227}
print(space_eval(space, best))
# -> ('case 2', 0.01420615366247227)
```
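If you want to inspect every point that was evaluated, not just the best one, you can pass a `Trials` object to `fmin`. A minimal sketch, reusing the `objective` and `space` defined above:

```python
from hyperopt import fmin, tpe, Trials

# collect details of every evaluation in a Trials object
trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=100, trials=trials)

print(trials.best_trial['result'])  # result dict of the best evaluation
print(trials.losses()[:5])          # losses of the first five evaluations
```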
If you're a developer, clone this repository and install from source:
```bash
git clone https://github.com/hyperopt/hyperopt.git
cd hyperopt
pip install -e '.[MongoTrials, SparkTrials, ATPE, dev]'
```
Note that the dev dependencies include black, which requires Python 3.6+. We recommend running black to format your code before submitting a PR. You can run it automatically via a pre-commit hook:

```bash
pre-commit install
```

Then, when you commit, make sure git hooks are activated (PyCharm, for example, has an option to skip them). The hook will run black automatically on every file you modified, and will fail the commit if any file still needs reformatting.
Currently three algorithms are implemented in hyperopt:
- Random Search
- Tree of Parzen Estimators (TPE)
- Adaptive TPE
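The algorithm is selected by passing its suggest function as the `algo` argument of `fmin`. A minimal sketch, reusing the `objective` and `space` from the first example (Adaptive TPE needs the extra dependencies from the `ATPE` install option):

```python
from hyperopt import fmin, rand, tpe, atpe

# random search
best = fmin(objective, space, algo=rand.suggest, max_evals=100)

# Tree of Parzen Estimators
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

# Adaptive TPE (requires the ATPE extra dependencies)
best = fmin(objective, space, algo=atpe.suggest, max_evals=100)
```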
Hyperopt has been designed to accommodate Bayesian optimization algorithms based on Gaussian processes and regression trees, but these are not currently implemented.
All algorithms can be parallelized in two ways, using:
- Apache Spark
- MongoDB
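For example, Spark-based parallelization goes through `SparkTrials`, which evaluates the trials of a single `fmin` call concurrently on a Spark cluster; it requires pyspark (covered by the `SparkTrials` install option). A minimal stand-alone sketch, where `parallelism=4` is an illustrative choice for the number of concurrent trials:

```python
from hyperopt import fmin, hp, tpe, SparkTrials

# a simple stand-alone objective for illustration
def objective(x):
    return x ** 2

# parallelism caps how many trials run concurrently on the cluster
spark_trials = SparkTrials(parallelism=4)
best = fmin(objective, space=hp.uniform('x', -10, 10),
            algo=tpe.suggest, max_evals=100, trials=spark_trials)
```

MongoDB-based parallelization follows the same `trials` pattern via `MongoTrials` from `hyperopt.mongoexp`.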
Hyperopt documentation can be found at http://hyperopt.github.io/hyperopt, but is partly still hosted on the wiki.
See projects using hyperopt on the wiki.
If you use this software for research, please cite the following paper:
Bergstra, J., Yamins, D., Cox, D. D. (2013) Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. To appear in Proc. of the 30th International Conference on Machine Learning (ICML 2013).
This project has received support from
- National Science Foundation (IIS-0963668),
- Banting Postdoctoral Fellowship program,
- Natural Sciences and Engineering Research Council of Canada (NSERC),
- D-Wave Systems, Inc.