
Is it possible to compute fun and jacobian at the same time in nonlinear constraint? #12692

Closed
HiroIshida opened this issue Aug 10, 2020 · 2 comments

Comments

@HiroIshida

According to the documentation, when evaluating the objective function one can compute both the value f(x) and the gradient grad(f(x)) in a single call. This is done by setting the boolean value of jac to True.

jac{callable, ‘2-point’, ‘3-point’, ‘cs’, bool}, optional
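
For example, a minimal sketch of that behaviour (using the built-in rosen/rosen_der purely for illustration):

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

def fun_and_grad(x):
    # With jac=True, the objective is expected to return (f(x), grad f(x)).
    return rosen(x), rosen_der(x)

res = minimize(fun_and_grad, np.array([0.5, 0.0]), method='SLSQP', jac=True)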

Similarly, I want to compute both the constraint function and its Jacobian in a single call when evaluating nonlinear inequality constraints. However, according to the documentation, this does not seem to be an option:

jac{callable, ‘2-point’, ‘3-point’, ‘cs’}, optional   (no boolean!)

Is there any way to achieve this?

@HiroIshida
Author

Looking at the source code, I found that the Jacobian is evaluated right after the constraint function. Thus, a workaround is to cache the Jacobian in a global variable (or in a class) while evaluating the constraint function, and return the cached value from the Jacobian callback, like below:

import numpy as np
from scipy.optimize import minimize, Bounds, rosen, rosen_der

jac_cache = None  # Jacobian computed as a by-product of the constraint evaluation

def ineq_fun(x):
    global jac_cache
    fx = np.array([1 - x[0] - 2*x[1], 1 - x[0]**2 - x[1], 1 - x[0]**2 + x[1]])
    # Cache the Jacobian here so the 'jac' callback can return it without recomputing.
    jac_cache = np.array([[-1.0, -2.0], [-2*x[0], -1.0], [-2*x[0], 1.0]])
    return fx

def ineq_jac_fun(x):
    if jac_cache is None:
        raise RuntimeError("constraint function has not been evaluated yet")
    return jac_cache

ineq_cons = {'type': 'ineq',
             'fun' : ineq_fun,
             'jac' : ineq_jac_fun}
eq_cons = {'type': 'eq',
           'fun' : lambda x: np.array([2*x[0] + x[1] - 1]),
           'jac' : lambda x: np.array([2.0, 1.0])}

bounds = Bounds([0, -0.5], [1.0, 2.0])
x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='SLSQP', jac=rosen_der,
               constraints=[eq_cons, ineq_cons], options={'ftol': 1e-9, 'disp': True},
               bounds=bounds)
print(res.x)
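
Note that this workaround relies on the observation above that the constraint Jacobian is evaluated right after the constraint function at the same x; if the evaluation order were different, the cached Jacobian could be stale.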

@HiroIshida
Author

HiroIshida commented Aug 11, 2020

Rather than using a global variable or a custom class, I finally found that using a closure, as below, is more concise and easier to incorporate into my existing code:

import numpy as np

def generate_ineq_funs():
    # Shared state captured by both closures.
    member = {'jac_cache': None}
    ineq_jac_fun = lambda x: member['jac_cache']

    def ineq_fun(x):
        fx = np.array([1 - x[0] - 2*x[1], 1 - x[0]**2 - x[1], 1 - x[0]**2 + x[1]])
        # Cache the Jacobian computed as a by-product of evaluating the constraint.
        member['jac_cache'] = np.array([[-1.0, -2.0], [-2*x[0], -1.0], [-2*x[0], 1.0]])
        return fx

    return ineq_fun, ineq_jac_fun
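
A minimal sketch of how the generated pair plugs into the same SLSQP call (reusing eq_cons, bounds, and x0 from the earlier example):

ineq_fun, ineq_jac_fun = generate_ineq_funs()
ineq_cons = {'type': 'ineq', 'fun': ineq_fun, 'jac': ineq_jac_fun}
res = minimize(rosen, x0, method='SLSQP', jac=rosen_der,
               constraints=[eq_cons, ineq_cons], options={'ftol': 1e-9, 'disp': True},
               bounds=bounds)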
