
Nesting configuration variable causes exception for TPE implementation #175

Description

@willgroves

My apologies if the behavior described below is not intended to be supported; if so, please close this ticket. The Bergstra 2013 paper, in the section "Sharing a configuration variable across choice branches", seems to indicate that such nesting is possible. Nesting one configuration variable inside another's arguments works correctly for random search but causes an exception for TPE (after the first 20 bootstrapping evaluations).

I am trying to describe a parameter space in which the value of a parameter further down the tree (b) depends on a value higher in the tree (a).

Constraints:
0 <= a <= 50
0 <= b <= 50
b >= a

A minimal working example demonstrating the problem is below. The code runs correctly with algo=rand.suggest but fails with algo=tpe.suggest.

Question: is this simply not intended to be supported, and does it only happen to work for random search?

Also, if this is not intended to be supported, what is the best way to implement such a parameter search? One approach would be to perform the constraint check inside the loss function: configurations that do not satisfy the constraints would simply report status STATUS_FAIL (see the sketch below). Of course, if large portions of the parameter space return STATUS_FAIL, the number of evaluations would need to be much greater to account for the large number of failures.
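For reference, a minimal sketch of that workaround, assuming a flat space in which a and b are sampled independently and the b >= a constraint is enforced by rejection in the loss function (the names flat_space and constrained_loss are illustrative, not from the MWE):

import hyperopt
from hyperopt import hp

# Flat space: both bounds are constants, so TPE never sees a nested
# configuration variable.
flat_space = {
    'a': hp.quniform('a', 0, 50, 1.0),
    'b': hp.quniform('b', 0, 50, 1.0),
}

def constrained_loss(d):
    # Reject configurations that violate b >= a instead of scoring them.
    if d['b'] < d['a']:
        return {'status': hyperopt.STATUS_FAIL}
    val = abs(d['a'] - 25) + abs(d['b'] - 30)
    return {'loss': val, 'status': hyperopt.STATUS_OK}

trials = hyperopt.Trials()
best = hyperopt.fmin(constrained_loss, space=flat_space,
                     algo=hyperopt.tpe.suggest, max_evals=100, trials=trials)

Roughly half of this flat space violates b >= a, so max_evals would need to be raised accordingly.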

-Will

MWE:

import hyperopt
from hyperopt import hp

# b's lower bound is the value sampled for a, i.e. the nested case
mya = hp.quniform('a', 0, 50, 1.0)
space = {'a': mya, 'b': hp.quniform('b', mya, 50, 1.0)}

def loss(d):
    val = abs(d.get('a') - 25) + abs(d.get('b') - 30)
    # print "computed loss: ", val, "config:", d
    return {'loss': val, 'status': hyperopt.STATUS_OK, 'input': d}

trials = hyperopt.Trials()

print hyperopt.__version__
me = 10
r = hyperopt.fmin(loss, space=space, algo=hyperopt.tpe.suggest, max_evals=me, trials=trials)
print "after ", me, ":", trials.average_best_error(), r
me = 100
r = hyperopt.fmin(loss, space=space, algo=hyperopt.tpe.suggest, max_evals=me, trials=trials)
print "after ", me, ":", trials.average_best_error(), r

Program output:

##output from using rand.suggest
#0.0.3.dev
#after  10 : 4.0 {'1': 25.0, '2': 26.0}
#after  100 : 2.0 {'1': 26.0, '2': 29.0}

##output from tpe.suggest
#0.0.3.dev
#after  10 : 4.0 {'1': 25.0, '2': 26.0}
Traceback (most recent call last):
  File "/home/groves/workspace/PythonUtils/HyperoptExperiments/DependentGraphExample/dependent_tree_sample_bug_mwe.py", line 31, in <module>
    r = hyperopt.fmin(loss,space=space,algo=hyperopt.tpe.suggest,max_evals=me,trials=trials)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/fmin.py", line 340, in fmin
    verbose=verbose)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/base.py", line 588, in fmin
    pass_expr_memo_ctrl=pass_expr_memo_ctrl)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/fmin.py", line 351, in fmin
    rval.exhaust()
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/fmin.py", line 302, in exhaust
    self.run(self.max_evals - n_done, block_until_done=self.async)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/fmin.py", line 257, in run
    new_trials = algo(new_ids, self.domain, trials)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/tpe.py", line 902, in suggest
    print_node_on_error=False)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/pyll/base.py", line 860, in rec_eval
    rval = scope._impls[node.name](*args, **kwargs)
  File "/home/groves/progs/hyperopt/hyperopt/hyperopt/tpe.py", line 432, in adaptive_parzen_normal
    srtd_mus[:prior_pos] = mus[order[:prior_pos]]
TypeError: only integer arrays with one element can be converted to an index
