
Hyperopt: Distributed Hyperparameter Optimization


Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.

Getting started

Install hyperopt from PyPI

pip install hyperopt

Then run your first example:

# define an objective function
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# define a search space
from hyperopt import hp
space = hp.choice('a',
    [
        ('case 1', 1 + hp.lognormal('c1', 0, 1)),
        ('case 2', hp.uniform('c2', -10, 10))
    ])

# minimize the objective over the space
from hyperopt import fmin, tpe, space_eval
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

print(best)
# -> {'a': 1, 'c2': 0.01420615366247227}
print(space_eval(space, best))
# -> ('case 2', 0.01420615366247227)
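fmin can also record every evaluation in a Trials object, and accepts a timeout parameter (in seconds) that caps the total duration of the search. A minimal sketch; the quadratic objective here is just an illustrative stand-in:

# keep a record of every evaluation, and stop the search after 60 seconds
# even if max_evals has not been reached
from hyperopt import fmin, tpe, hp, Trials

trials = Trials()
best = fmin(
    fn=lambda x: x ** 2,             # toy objective: minimize x squared
    space=hp.uniform('x', -10, 10),  # search space: x in [-10, 10]
    algo=tpe.suggest,
    max_evals=100,
    timeout=60,
    trials=trials,
)

print(best)                # a dict like {'x': ...}, close to 0
print(len(trials.trials))  # number of evaluations actually run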

If you're a developer, clone this repository and install from source:

git clone https://github.com/hyperopt/hyperopt.git
cd hyperopt && python setup.py develop && pip install -e '.[MongoTrials, SparkTrials, ATPE, dev]'

Note that the dev dependencies include black, which requires Python 3.6+.

We recommend using black to format your code before submitting a PR. You can set it up as a pre-commit hook as follows:

pre-commit install

Then, when you commit, make sure git hooks are activated (PyCharm, for example, has an option to skip them). The hook runs black automatically on every file you modified, and fails if any of them still need reformatting.

Algorithms

Currently three algorithms are implemented in hyperopt:

  • Random Search
  • Tree of Parzen Estimators (TPE)
  • Adaptive TPE

Each algorithm is selected by passing its suggest function to fmin, as in the sketch after this list.
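A minimal sketch; the toy objective is illustrative, and atpe assumes the optional ATPE extras from the install step above:

# select the search algorithm via the algo argument to fmin
from hyperopt import fmin, hp, rand, tpe, atpe

space = hp.uniform('x', -10, 10)

def objective(x):
    return x ** 2  # toy objective

best_rand = fmin(objective, space, algo=rand.suggest, max_evals=100)  # random search
best_tpe = fmin(objective, space, algo=tpe.suggest, max_evals=100)    # TPE
best_atpe = fmin(objective, space, algo=atpe.suggest, max_evals=100)  # adaptive TPE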

Hyperopt has been designed to accommodate Bayesian optimization algorithms based on Gaussian processes and regression trees, but these are not currently implemented.

All algorithms can be parallelized in two ways, using:

  • Apache Spark, via SparkTrials
  • MongoDB, via MongoTrials

A sketch of the Spark route follows this list.
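A minimal sketch of Spark-based parallelization, assuming a Spark environment is available; MongoDB-based parallelization works analogously through hyperopt.mongoexp.MongoTrials and mongo worker processes:

# distribute trial evaluations across a Spark cluster
from hyperopt import fmin, hp, tpe, SparkTrials

spark_trials = SparkTrials(parallelism=4)  # evaluate up to 4 trials concurrently

def objective(x):
    return x ** 2  # toy objective

best = fmin(
    fn=objective,
    space=hp.uniform('x', -10, 10),
    algo=tpe.suggest,
    max_evals=100,
    trials=spark_trials,
)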

Documentation

Hyperopt documentation can be found at http://hyperopt.github.io/hyperopt, but is partly still hosted on the wiki.

Examples

See projects using hyperopt on the wiki.

Announcements mailing list

Announcements

Discussion mailing list

Discussion

Cite

If you use this software for research, please cite the following paper:

Bergstra, J., Yamins, D., Cox, D. D. (2013) Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. To appear in Proc. of the 30th International Conference on Machine Learning (ICML 2013).

Thanks

This project has received support from

  • National Science Foundation (IIS-0963668),
  • Banting Postdoctoral Fellowship program,
  • Natural Sciences and Engineering Research Council of Canada (NSERC),
  • D-Wave Systems, Inc.