Constraint Handling
It is common to consider a constrained optimization task, where the search procedure has to respect equality and inequality conditions, namely h(x) = 0 and g(x) <= 0. The constraints are handled by imposing a dynamic penalty on the acquisition function when it is optimized to propose new candidate solutions, and it is straightforward to implement your own equality or inequality constraints.
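The snippet below is a conceptual sketch of this penalty mechanism, not the library's internal code: the acquisition value of a candidate is reduced in proportion to how strongly the candidate violates the constraints, with an illustrative penalty coefficient.

```python
# Conceptual sketch only -- not the library's actual implementation.
# The acquisition value is penalized by the total constraint violation.
def penalized_acquisition(x, acquisition, h, g, penalty=1e4):
    violation = abs(h(x)) + max(g(x), 0.0)   # equality + inequality violation
    return acquisition(x) - penalty * violation
```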
Input argument to the constraint functions: the equality function `h` and the inequality function `g` work just like the objective function `obj_fun`. They can be defined to take either a list or a dictionary as input, and this choice should be communicated to the optimizer by setting the `eval_type` argument of the `BO` class.
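For example, a constraint written against a dictionary input might look as follows; the variable names `x1` and `x2` and the value `eval_type='dict'` are illustrative assumptions:

```python
# Hypothetical dictionary-based equality constraint, meant for an optimizer
# created with eval_type='dict' (the variable names are illustrative).
def h(par):
    return abs(par['x1'] + par['x2'] - 1)
```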
For equality constraints, the function should return 0 only when the condition is satisfied, and it is generally recommended to return a larger value when the degree of constraint violation is higher. For instance, in the example below, we would like to restrict the search to the hyperplane x_1 + x_2 + ... + x_n = 1:
```python
import numpy as np

def h(x):
    # zero on the hyperplane, growing with the distance from it
    return np.abs(np.sum(x) - 1)
```
If you have more than one constraint, the constraint function should return a vector containing the value of each constraint.
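For example, a hypothetical case with two equality constraints could stack both values into a NumPy array:

```python
import numpy as np

# Illustrative only: two equality constraints returned as one vector,
# enforcing sum(x) = 1 and x[0] = x[1] simultaneously.
def h(x):
    return np.array([np.abs(np.sum(x) - 1),
                     np.abs(x[0] - x[1])])
```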
For inequality constraints, the function should return a non-positive value only when the condition is satisfied, and similarly, it is recommended to return a larger value when the degree of constraint violation is higher. For instance, in the example below, we would like to restrict the search to one half of the search space, delimited by the condition x_1 + x_2 + ... + x_n - 1 <= 0:
```python
import numpy as np

def g(x):
    # non-positive inside the feasible half-space, positive outside it
    return np.sum(x) - 1
```
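As a quick sanity check (with an assumed 2-dimensional input), `g` is non-positive for feasible points and positive for infeasible ones:

```python
print(g([0.2, 0.3]))   # -0.5 -> condition satisfied
print(g([0.9, 0.8]))   #  0.7 -> condition violated (positive value)
```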
The constraint functions `h` and `g` should be passed to the constructor of the `BO` class (and its subclasses) or to the `fmin` interface.
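A minimal usage sketch is given below; the keyword names (`eq_fun`, `ineq_fun`) and the surrounding setup are assumptions made for illustration, so please check the constructor signature of the `BO` class in your installed version:

```python
# Hypothetical wiring of the constraints -- the keyword argument names
# below are assumptions and may differ from the actual BO signature.
opt = BO(
    search_space=space,   # search space defined elsewhere
    obj_fun=obj_fun,      # objective function
    eq_fun=h,             # equality constraint(s)
    ineq_fun=g,           # inequality constraint(s)
    eval_type='list',     # objective/constraints take a list as input
)
# The run() call and its return values are likewise illustrative.
xopt, fopt, stop_dict = opt.run()
```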