TPE solver and initial documentation
Showing 4 changed files with 153 additions and 0 deletions.
@@ -0,0 +1,23 @@
Tree-structured Parzen Estimator
================================

.. include:: /global.rst

This solver is implemented in |api-tpe|. It is available in |make_solver| as 'TPE'.

The Tree-structured Parzen Estimator (TPE) is a sequential model-based optimization (SMBO) approach.
SMBO methods sequentially construct models to approximate the performance of hyperparameters based on historical measurements,
and then choose new hyperparameters to test based on this model.
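The SMBO loop described above can be illustrated with a small, self-contained sketch. This is a toy illustration only, not Optunity's or Hyperopt's implementation: it minimizes a one-dimensional function by repeatedly splitting past trials into "good" and "bad" sets, modeling each with a kernel density estimate, and proposing the candidate with the best density ratio (the core idea behind TPE). All names (`kde`, `tpe_minimize`, the bandwidth and `gamma` values) are made up for the example.

```python
import math
import random

def kde(points, x, bandwidth=0.5):
    """Crude Gaussian kernel density estimate at x."""
    if not points:
        return 1e-12
    return sum(math.exp(-0.5 * ((x - p) / bandwidth) ** 2) for p in points) / len(points)

def tpe_minimize(f, lo, hi, num_evals=30, gamma=0.25, rng=None):
    """Toy TPE-style SMBO loop: model good/bad trials, pick best density ratio."""
    rng = rng or random.Random(0)
    trials = []  # list of (x, f(x)) pairs collected so far
    for _ in range(num_evals):
        if len(trials) < 5:
            # warm-up phase: sample uniformly at random
            x = rng.uniform(lo, hi)
        else:
            # split trials: best gamma-fraction is "good", the rest "bad"
            trials.sort(key=lambda t: t[1])
            n_good = max(1, int(gamma * len(trials)))
            good = [t[0] for t in trials[:n_good]]
            bad = [t[0] for t in trials[n_good:]]
            # propose candidates; keep the one maximizing good/bad density ratio
            cands = [rng.uniform(lo, hi) for _ in range(20)]
            x = max(cands, key=lambda c: kde(good, c) / kde(bad, c))
        trials.append((x, f(x)))
    return min(trials, key=lambda t: t[1])

best_x, best_y = tpe_minimize(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
print(best_x, best_y)
```

With only 30 evaluations the loop concentrates samples near the best observations found so far, which is the essential behavior SMBO methods exploit.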

The TPE approach models :math:`P(x|y)` and :math:`P(y)`, where :math:`x` represents hyperparameters and :math:`y` the associated quality score.
:math:`P(x|y)` is modeled by transforming the generative process of hyperparameters, replacing the distributions of the configuration prior
with non-parametric densities. In this solver, Optunity only supports uniform priors within given box constraints.
For more exotic search spaces, please refer to [Hyperopt]_. This optimization approach is described in detail in [TPE2011]_ and [TPE2013]_.

Optunity provides the TPE solver as it is implemented in [Hyperopt]_. This solver is only available if Hyperopt is installed, which in turn requires NumPy. Both dependencies must be met to use this solver.
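A minimal usage sketch, assuming Optunity's public ``make_solver`` entry point: the box constraints are passed as keyword arguments, mirroring what the solver's ``suggest_from_box`` helper produces. The ``tpe_config`` helper below is hypothetical (it just replicates that dictionary merging); the guarded block at the end only runs when Optunity and its optional Hyperopt/NumPy dependencies are actually installed.

```python
def tpe_config(num_evals, **box):
    # hypothetical helper mirroring TPE.suggest_from_box:
    # box constraints plus the evaluation budget in one dict
    config = dict(box)
    config['num_evals'] = num_evals
    return config

cfg = tpe_config(100, x=[-5, 5], y=[0, 10])
print(cfg)

# Guarded: the TPE solver raises ImportError when Hyperopt/NumPy are missing.
try:
    import optunity
    solver = optunity.make_solver('TPE', **cfg)
except Exception:
    solver = None  # Optunity or its optional dependencies are not available
```

If the dependencies are present, the resulting solver can then be run through Optunity's generic optimization interface like any other registered solver.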

.. [TPE2011] Bergstra, James S., et al. "Algorithms for hyper-parameter optimization." Advances in Neural Information Processing Systems. 2011.
.. [TPE2013] Bergstra, James, Daniel Yamins, and David Cox. "Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures." Proceedings of the 30th International Conference on Machine Learning. 2013.
.. [Hyperopt] http://jaberg.github.io/hyperopt/
@@ -0,0 +1,6 @@
Sobol sequences
===============

.. include:: /global.rst

TODO
@@ -0,0 +1,120 @@
#! /usr/bin/env python

# Copyright (c) 2014 KU Leuven, ESAT-STADIUS
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# 1. Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
#
# 3. Neither name of copyright holders nor the names of its contributors
# may be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR
# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

from .solver_registry import register_solver
from .util import Solver, _copydoc

import random

_numpy_available = True
try:
    import numpy
except ImportError:
    _numpy_available = False

_hyperopt_available = True
try:
    import hyperopt
except ImportError:
    _hyperopt_available = False


class TPE(Solver):
    """
    .. include:: /global.rst

    This solver implements the Tree-structured Parzen Estimator, as described in [TPE2011]_.
    This solver uses Hyperopt in the back-end and exposes the TPE estimator with uniform priors.

    Please refer to |tpe| for details about this algorithm.

    .. [TPE2011] Bergstra, James S., et al. "Algorithms for hyper-parameter optimization." Advances in Neural Information Processing Systems. 2011.

    """
    def __init__(self, num_evals=100, seed=None, **kwargs):
        """Initialize the TPE solver.

        :param num_evals: number of permitted function evaluations
        :param seed: seed for the random number generator
        :param kwargs: box constraints per hyperparameter, e.g. ``x=[-1, 1]``

        """
        if not _hyperopt_available:
            raise ImportError('This solver requires Hyperopt but it is missing.')
        if not _numpy_available:
            raise ImportError('This solver requires NumPy but it is missing.')

        self._seed = seed
        self._bounds = kwargs
        self._num_evals = num_evals

    @staticmethod
    def suggest_from_box(num_evals, **kwargs):
        """Create a configuration dictionary for this solver from box constraints."""
        d = dict(kwargs)
        d['num_evals'] = num_evals
        return d

    @property
    def seed(self):
        return self._seed

    @property
    def bounds(self):
        return self._bounds

    @property
    def num_evals(self):
        return self._num_evals

    @_copydoc(Solver.optimize)
    def optimize(self, f, maximize=True, pmap=map):
        # note: pmap is accepted for interface compatibility but is unused,
        # since TPE is an inherently sequential algorithm

        # Hyperopt's fmin minimizes, so negate the objective when maximizing
        if maximize:
            def obj(args):
                kwargs = dict(zip(self.bounds.keys(), args))
                return -f(**kwargs)
        else:
            def obj(args):
                kwargs = dict(zip(self.bounds.keys(), args))
                return f(**kwargs)

        def algo(*args, **kwargs):
            # compare against None explicitly: a seed of 0 is valid
            # and must not be replaced by a random one
            seed = self.seed if self.seed is not None else random.randint(0, 9999999999)
            return hyperopt.tpe.suggest(*args, seed=seed, **kwargs)

        space = [hyperopt.hp.uniform(k, v[0], v[1]) for k, v in self.bounds.items()]
        best = hyperopt.fmin(obj, space=space, algo=algo, max_evals=self.num_evals)
        return best, None


# TPE is a simple wrapper around Hyperopt's TPE solver
if _hyperopt_available and _numpy_available:
    TPE = register_solver('TPE', 'Tree of Parzen estimators',
                          ['TPE: Tree of Parzen Estimators'])(TPE)