
MAINT: optimize: make trust-constr accept constraint dict (#9043) #9112

Merged
merged 15 commits into from
Aug 25, 2018

Conversation

mdhaber
Contributor

@mdhaber mdhaber commented Aug 4, 2018

Attempt to address #9043: this PR adds conversion from old- to new-style constraints for trust-constr and raises an error when new-style constraints are used with other methods.
As far as I can tell, trust-constr already accepts old-style bounds and converts new- to old-style bounds for other methods.
Couldn't run tests on my computer due to SciPy install issues... let's see how this goes.

@pv
Member

pv commented Aug 4, 2018

The other direction should also be added: the old solvers accepting the "new" constraint format.
trust-constr should not behave differently from the other solvers.

@@ -585,6 +587,23 @@ def minimize(fun, x0, args=(), method=None, jac=None, hess=None,
if isinstance(bounds, Bounds):
bounds = new_bounds_to_old(bounds.lb, bounds.ub, x0.shape[0])

all_constraint_types = (NonlinearConstraint, LinearConstraint, dict)
Member

@pv pv Aug 4, 2018

This should go into a separate helper function, maybe called
normalize_constraints_to_new_format, leaving only if method == 'trust-constr': (...) = normalize_constraints_to_new_format(...) here.

Member

Also, the bounds would need to be normalized.

Contributor Author

@mdhaber mdhaber Aug 4, 2018

@pv You've written:
"Also, the bounds would need to be normalized."
and in the original issue:
"Automatic conversion of the "old-style" ... bounds specifications to the "new" format"
I think the code ending two lines above your comment already addresses this. The new solver accepts old-style bounds, and the old solvers accept new-style bounds. The tests I wrote use old-style bounds with the new solver. What do you mean when you write that the bounds need to be normalized/converted? It seems this was already done.

I did want to put this all in a separate function, but instead I followed the style of the bounds conversion code and left it inline. I suppose I'll move both to separate functions in _constraints.py.

If I'm going to do this, I'd prefer to take all of the logic out and just write functions:
bounds = standardize_bounds(bounds, method)
constraints = standardize_constraints(constraints, method)
which makes the bounds/constraints appropriate for the method.
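A minimal sketch of what such a dispatch could look like; the Bounds stand-in and the internals here are simplified assumptions, not the actual scipy implementation:

```python
class Bounds:
    # minimal stand-in for scipy.optimize.Bounds
    def __init__(self, lb, ub):
        self.lb, self.ub = lb, ub


def standardize_bounds(bounds, method):
    """Return bounds in the representation `method` expects (sketch)."""
    if method == 'trust-constr':
        # new-style solver: wrap (lb, ub) pairs into a Bounds object
        if not isinstance(bounds, Bounds):
            lb = [b[0] for b in bounds]
            ub = [b[1] for b in bounds]
            bounds = Bounds(lb, ub)
    else:
        # old-style solvers: flatten Bounds into a sequence of (lb, ub) pairs
        if isinstance(bounds, Bounds):
            bounds = list(zip(bounds.lb, bounds.ub))
    return bounds
```

The same shape works for standardize_constraints: branch once on the method, convert, and return the converted object.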

Contributor Author

Regarding conversion in the other direction, you mentioned in the original issue that an error message would also be acceptable "if [conversion is] not possible". I took "if that's not possible" to mean if it is not implemented. It's not implemented, so I raised an error. I would prefer to leave that for a separate PR because I did not originally intend to implement it and it will be a bit more involved. This PR is for making "trust-constr accept constraint dict".

@mdhaber
Contributor Author

mdhaber commented Aug 4, 2018

I can't go any further with this until I can get my development setup fixed.
Version:
1.2.0.dev0+5e8860f 1.15.0 sys.version_info(major=3, minor=6, micro=4, releaselevel='final', serial=0)
Installed according to the instruction video I made:
https://www.youtube.com/watch?v=1rPOSNd0ULI&feature=youtu.be
Ensured that everything, including OS, xcode tools, homebrew, gcc, conda, python, scipy dependencies, and scipy source are up to date.
python runtests.py -v has errors collecting tests, ending in:

_________________________________ ERROR collecting build/testenv/lib/python3.6/site-packages/scipy/stats/tests/test_tukeylambda_stats.py _________________________________
ImportError while importing test module '/Users/matthaberland/Desktop/scipydev/scipy/build/testenv/lib/python3.6/site-packages/scipy/stats/tests/test_tukeylambda_stats.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
scipy/stats/__init__.py:353: in <module>
    from .stats import *
scipy/stats/stats.py:171: in <module>
    import scipy.special as special
scipy/special/__init__.py:643: in <module>
    from .basic import *
scipy/special/basic.py:18: in <module>
    from . import specfun
E   ImportError: dlopen(/Users/matthaberland/Desktop/scipydev/scipy/build/testenv/lib/python3.6/site-packages/scipy/special/specfun.cpython-36m-darwin.so, 2): Library not loaded: /usr/local/opt/gcc/lib/gcc/7/libgfortran.4.dylib
E     Referenced from: /Users/matthaberland/Desktop/scipydev/scipy/build/testenv/lib/python3.6/site-packages/scipy/special/specfun.cpython-36m-darwin.so
E     Reason: image not found
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 168 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
=============================================================== 21 deselected, 168 error in 16.87 seconds ================================================================

Thoughts on how to fix?
Maybe I need to downgrade gcc. I can't find that gcc folder:
/usr/local/opt/gcc/lib/gcc/7/
there is instead
/usr/local/opt/gcc/lib/gcc/8/

@pv
Member

pv commented Aug 4, 2018

"rm -rf build"? If it's not that, probably something with the compiler/libraries setup is wrong, but I don't know about OS X.

@mdhaber
Contributor Author

mdhaber commented Aug 4, 2018

It seems to be working after installing both gcc 7 and gcc 8 and moving some folders around...
Yeah I was trying to force it to rebuild everything - I suppose deleting the build folder will do that.

@mdhaber
Contributor Author

mdhaber commented Aug 4, 2018

@pv "The other direction should also be added..."

Equality constraints and inequality constraints must be separated in the old constraint dictionaries; however, they are (in general) specified together in the new constraint classes: an equality constraint arises when the new-style constraint's lower and upper bounds are equal.

I see three options:

  1. Do the naive thing. The constraint function specified in the new constraint will, in general, be evaluated twice: once for the old-style equality constraint, and once for the old-style inequality constraint.
  2. Do something fancy to cache results of constraint function evaluation so that it is not actually called twice even when the old-style equality and inequality constraints are evaluated separately.
  3. Convert all new-style constraints to old-style inequality constraints. (New-style equality constraints will be expressed as two old-style inequality constraints.)

The third option might sound appealing, but without knowing the details of SLSQP, I'm not sure whether it would work well in practice. (I'm assuming that strategy wouldn't work well for COBYLA; otherwise, why doesn't it accept equality constraints and automatically convert them into pairs of inequality constraints?)
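For concreteness, option 3 would express an equality constraint h(x) = 0 as the pair h(x) >= 0 and -h(x) >= 0. A toy sketch (the helper name is hypothetical):

```python
# Toy sketch of option 3: one equality constraint becomes two old-style
# inequality constraint dicts (helper name is hypothetical).
def eq_as_two_ineqs(h):
    return [{'type': 'ineq', 'fun': h},
            {'type': 'ineq', 'fun': lambda x: -h(x)}]
```

Note that h is evaluated twice per point, which is exactly the kind of redundancy being discussed here.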

I repeatedly expressed my concerns over abandoning the old-style constraint dictionaries during the development of trust-constr. However, now that we have what we have, I propose leaving the error message (which I already added), prompting the user to express constraints in the old style if they wish to use methods other than trust-constr.

@pv
Member

pv commented Aug 5, 2018

I think the current API inconsistency makes no sense and needs to be addressed. I regret that I did not pay close attention to the details here earlier, but it is still possible to fix this.

It is simplest to take the most naive option (i.e., evaluate the function multiple times if needed). Since the user can specify a list of Constraint objects, the performance issue can be avoided if necessary.

There are two options: convert Constraint objects to equality constraints only when all of the bounds are equal, or try to do the conversion piecemeal. The simpler option is likely the former. If the user wants equality constraints, those need to be specified in a separate Constraint object as far as the "new" API is concerned.

EDIT: also the option of evaluating multiple times and picking out the right parts by indexing is probably fine, doesn't probably need that much code.

Raising an error on the "wrong" way to specify constraints is not good, because the whole point of the minimize API was to provide a unified way to specify and solve optimization problems. This idea should still be followed; the whole thing makes no sense otherwise, and the reasons for keeping the discrepancy do not appear strong. Performance is a secondary concern and, as noted above, there does not appear to be an unavoidable performance cost.
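The "evaluate multiple times and index" route can be sketched as follows (a hedged sketch with hypothetical names; the actual conversion in the PR may differ): split one new-style constraint (fun, lb, ub) into old-style dicts, each of which re-evaluates the constraint and indexes out its components.

```python
import numpy as np

# Sketch: split a new-style constraint (fun, lb, ub) into old-style dicts.
# Each dict's fun re-evaluates the constraint and picks out the relevant
# components by boolean-mask indexing.
def new_constraint_to_old_sketch(fun, lb, ub):
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    f = lambda x: np.atleast_1d(fun(x))
    eq = lb == ub                         # equality components
    lo = ~eq & np.isfinite(lb)            # finite lower bounds
    hi = ~eq & np.isfinite(ub)            # finite upper bounds
    cons = []
    if eq.any():
        cons.append({'type': 'eq', 'fun': lambda x: f(x)[eq] - lb[eq]})
    if lo.any():
        cons.append({'type': 'ineq', 'fun': lambda x: f(x)[lo] - lb[lo]})
    if hi.any():
        cons.append({'type': 'ineq', 'fun': lambda x: ub[hi] - f(x)[hi]})
    return cons
```

As pv notes, fun may be called up to three times per point here; callers who care can pass separate Constraint objects instead.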

@mdhaber
Contributor Author

mdhaber commented Aug 5, 2018

OK, I'll fix it. Yeah, I planned to separate them even for an arbitrary mix. Yay, fancy indexing.

@pv
Member

pv commented Aug 5, 2018

Thanks! It's of course possible to split the problem into multiple PRs (or tell me to do it myself :), if you like.

@mdhaber
Contributor Author

mdhaber commented Aug 5, 2018

@pv I think all the new-to-old constraint conversion functionality is there except conversion of the constraint Jacobian. I waited because, in the specification for new-style nonlinear constraints, the user can specify an approximation strategy rather than a callable. I see three options:

  1. Convert only user-specified Jacobians to the old-style dict if they are callable; if the user specifies an approximation strategy as a string it will be ignored and the optimization method's default approximation strategy will be used
  2. Convert whatever Jacobian is part of the new-style constraint to the old-style dict. If the user specifies an approximation strategy (written for trust-constr) or leaves the default ('2-point'), the approximation strategy will be used.
  3. Something like option 2 but giving the user some way to also elect to use the optimization method's default approximation strategy.
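Option 1 can be sketched in a few lines (hypothetical helper; an old-style dict simply omits the 'jac' key to get the solver's default finite differences):

```python
# Sketch of option 1 (hypothetical helper): keep the user's Jacobian in
# the old-style dict only when it is callable; a string strategy such as
# '2-point' is dropped, so the old solver uses its own default approximation.
def to_old_dict(fun, jac, kind='ineq'):
    d = {'type': kind, 'fun': fun}
    if callable(jac):
        d['jac'] = jac
    return d
```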

The other question is what to do about COBYLA. Right now none of the tests (written with SLSQP in mind) pass using COBYLA. Some of this is because COBYLA can't handle bounds (which really should be automatically converted to inequality constraints, right?) or equality constraints; other times COBYLA converges to a different local minimum (presumably) or doesn't converge. Not sure what to do here.

Suggestions for additional tests would be welcome!

@antonior92 antonior92 self-assigned this Aug 6, 2018
@antonior92
Member

antonior92 commented Aug 10, 2018

First of all, thank you very much for taking the initiative and helping to fix this @mdhaber!

The next diagram summarizes the current status of how minimize deals with constraints:

[class diagram]

As pointed out in issue #9043, we don't have a direct arrow between the new constraint classes and the old ones, and this goes against the entire principle of the minimize interface.

I would like to point out that I see two basic approaches here:

  1. The first approach is the one Matt has taken, which consists of creating functions for converting between the new constraints and the old ones
    [constraint conversion diagram]
    This approach has the major advantage of avoiding changes to the most critical parts of the scipy optimization code. However, the way I see it, it has three main problems: i) the function new_constraint_to_old reimplements much of the functionality inside CanonicalConstraint; ii) you end up with inefficiencies like the one Matt has mentioned above, which result from first converting to a format created to be user-friendly and then converting again to the efficient format actually used by the solver; iii) you can't share features (like derivative approximation) among the different solvers, since each of them basically reimplements the routines to "parse" the constraints.
  2. The second approach is to use the class CanonicalConstraint everywhere.
    [canonical constraint diagram]
    This would basically convert everything to a canonical constraint and then pass it to the solvers. Since this format is much closer to what is actually used by both SLSQP and COBYLA, it would probably avoid reimplementing many things and would avoid computing any function twice. It would also allow solvers to share features (like derivative approximation). The major disadvantage of this approach is that we would have to modify _minimize_slsqp and _minimize_cobyla internally, and changing well-tested core scipy code is kind of dangerous; for instance, look at my failure in trying to fix BFGS, which resulted in a problem in libraries depending on scipy (Issue BUG maybe: fmin_bfgs possibly broken in 1.0 #7959)

The second solution looks more elegant and efficient to me. It would also avoid repeating code across the different constrained optimizers. Nevertheless, it has the dangers I just mentioned.

Probably the best path here is to finish merging this PR implementing strategy (1), accepting eventual inefficiencies and redundancies in the code. In a later PR, if we want to resolve these inefficiencies and improve the code quality, we can implement (2).

What do you think Matt?

@antonior92
Member

@nmayorov I think your opinion here would be useful as well

@mdhaber
Contributor Author

mdhaber commented Aug 10, 2018

Sure, reworking SLSQP and COBYLA so that they use the new constraint classes sounds best (if done perfectly). To maintain backwards compatibility, you could leave in their existing constraint handling code, too, or you could use the old-to-new converter that's part of this PR, which is pretty efficient.

But that sounds like a lot of work. I was trying to do something non-invasive that I could finish last weekend.

So I think I'd be happy to finish this PR right now and leave all that to future work, unless you are offering to do what you propose in time for 1.2.

If not, the big question I need answered is how we would prefer to transfer Jacobian specifications from new to old per my comment above. In that case, I'd appreciate your thoughts on that, and let's move discussion of a future PR to a new issue.

@antonior92
Member

antonior92 commented Aug 10, 2018

I agree with you: it is best to finish this PR now and leave the harder modifications, such as the ones I just described, for a later PR (to be integrated in scipy >= 1.3).

About the Jacobian: I think the best option is to convert user-specified Jacobians to the old-style dict only if they are callable; if the user specifies an approximation strategy as a string, it will be ignored and the optimization method's default approximation strategy will be used. This is probably easier to implement and consistent with the idea of leaving the deeper integration between the solvers to deeper layers of the code (to be implemented in the future).

@mdhaber
Contributor Author

mdhaber commented Aug 10, 2018

@pv What do you think about Jacobian conversion (re: here)?

@pv
Member

pv commented Aug 10, 2018

Yes, I agree it would be better to modify SLSQP and COBYLA to work with the new constraint format directly. However, we'd still need the old->new format conversion code. Revising SLSQP/COBYLA is likely not that hard in the end, as the inputs to the fortran routines are well-defined (and hence it's not directly a modification of the algorithmic code, so I'm hoping there are no weird interactions).

Constraint Jacobians: SLSQP uses naive numerical differentiation implemented on the Python side if a user-provided routine is not given, and COBYLA doesn't need the Jacobian. As such, there's no need to allow using the "default numerical Jacobian", as what's implemented in trust-constr is likely better. So option 2) is probably better for the new->old conversion.

@pv
Member

pv commented Aug 10, 2018

On COBYLA: for this PR, I'd suggest skipping tests for bounds since it's not supported currently.

For convergence to different solutions --- if they're local minima, maybe multiple solutions can be allowed, or you can try switching to more trivial test problems for this PR.

@antonior92
Member

Matt, if you decide to go for (2), it is probably better if you take a look at the class PreparedConstraint; that is where the numerical differentiation is implemented...

@mdhaber
Contributor Author

mdhaber commented Aug 11, 2018

@pv you write:
"Yes, I agree it would be better to modify SLSQP and COBYLA to work with the new constraint format directly."
This suggests that I should abandon new-to-old conversion.
Then there is:
" So option 2) is probably better for the new->old conversion"
Which suggests that I complete it.

Are you agreeing that I should complete this PR and leave reworking the other optimizers for future work?

@antonior92 thanks, I know. That's what I had in mind.

@pv
Member

pv commented Aug 11, 2018

I mean as a long term goal, it would be better to use only one format internally.

@mdhaber
Contributor Author

mdhaber commented Aug 12, 2018

@antonior92 I tried using the PreparedConstraint class as I had planned. The primary reason was that its field fun has a field jac that I can pass in as the Jacobian in the old constraint dictionary, right?

For example, if con is a NonlinearConstraint and x0 is the guess, then:

pcon = PreparedConstraint(con, x0)
pcon.fun.fun # should be basically con.fun
pcon.fun.jac # should approximate the jacobian of con.fun

Correct?

Well, before I even got to using pcon.fun.jac, I replaced the use of con.fun in my code with pcon.fun.fun, and some of the test problems started failing. The Elec problem found a different, but equally optimal, solution (I think), but Maratos started converging to a different solution with an objective value of -0.5 instead of -1. Even when I play with the guess, it refuses to converge to the correct solution.

Any thoughts on why that would be? What is going on inside the PreparedConstraint's fun.fun that causes it to behave differently from the original NonlinearConstraint's fun?

@antonior92
Member

Hi Matt,

Actually, I introduced some memoization logic inside PreparedConstraint (take a look here). This is necessary so I can efficiently approximate the derivatives (without recomputing the function).

This is probably responsible for the problem... I haven't found the exact source of the problem, though...

@mdhaber
Contributor Author

mdhaber commented Aug 17, 2018

I read that and figured that sort of thing might be going on. The behavior of SLSQP in this situation is similar to what I've seen it do when the constraint function is unsuitable for finite difference derivative approximation.

I am not using the attributes, only the public methods, so based on the two bullet points in the docstring I thought I would not be affected by any peculiarities.

I seemed to be affected nonetheless, so, having had no luck passing only the PreparedConstraint's fun.fun into the old-style constraint dict, I tried also passing fun.jac as the Jacobian. Still no luck. If these approaches are wrong, I'll need you to help me think through it. Keep in mind I have read only the docstring; I don't plan on studying the class code.

Without diving into the innards of SLSQP and COBYLA to see how they work, I need to provide them functions for evaluating the constraint and its Jacobian for arbitrary inputs, without any unusual behavior. When I call the function with a given argument, it returns the appropriate information. Very simple.

How do I persuade a PreparedConstraint to give me what I need?

@antonior92
Member

When I call the function with a given argument, it returns the appropriate information.

Matt, I will need to take a look, because I thought VectorFunction (which is used inside PreparedConstraint) was doing exactly that. The problem is probably that the order of calls inside SLSQP is different from the one used in trust-constr, and that is revealing some new problems with this function.

Probably the best way to fix this is to find a situation where the VectorFunction doesn't match the true function value (some minimal example) and debug from there.

@mdhaber
Contributor Author

mdhaber commented Aug 17, 2018

I already tried a for loop that passed random arrays into PreparedConstraint's version of the function and it always seemed to match the original function.

But clearly it's not working inside SLSQP. Could it be something to do with the values SLSQP passes in, maybe due to the small changes it makes when approximating derivatives? Do you do something like use np.allclose to detect whether the input is the same as before and use the cached value instead of re-evaluating?

If you don't know, at some point I can get back to this and perform a proper test that compares the original function w/ PreparedConstraint's with every input from SLSQP.

@antonior92
Member

Could it be something to do with the values SLSQP passes in, maybe due to the small changes it makes when approximating derivatives? Do you do something like use np.allclose to detect whether the input is the same as before and use the cached value instead of re-evaluating?

I don't think this is the problem, because I only reuse the value when I have an exact match (np.array_equal).

If you don't know, at some point I can get back to this and perform a proper test that compares the original function w/ PreparedConstraint's with every input from SLSQP.

Ok, I think we will need to be able to identify exactly when the mismatch happens. Then I can debug it and fix the problem on _differentiable_functions.py.

My guess is that it probably has something to do with the order in which we call the functions...
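The exact-match reuse described above can be illustrated with a small sketch (assumed behavior for illustration, not the actual VectorFunction code):

```python
import numpy as np

# Sketch of exact-match memoization: the cached value is reused only when
# the new x is identical to the stored one, via np.array_equal (never a
# tolerance-based check like np.allclose).
class MemoizedFun:
    def __init__(self, fun):
        self.fun = fun
        self.x = None
        self.f = None
        self.ncalls = 0

    def __call__(self, x):
        x = np.asarray(x)
        if self.x is not None and np.array_equal(x, self.x):
            return self.f            # exact match: reuse cached value
        self.ncalls += 1
        self.x = x.copy()            # copy guards against in-place mutation of x
        self.f = self.fun(x)
        return self.f
```

Storing a copy of x matters: a caller that later mutates the array in place would otherwise silently corrupt the cache.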

@mdhaber
Contributor Author

mdhaber commented Aug 18, 2018

If you tell me how to import PreparedConstraint, I can provide a complete reproducible example. Otherwise you're going to have to checkout my branch.
Never mind, I guess I can import it if my working directory is in scipy.

@mdhaber
Contributor Author

mdhaber commented Aug 19, 2018

Never mind again. You should check out my branch. (I can't get a stand-alone example working properly. It's behaving even more bizarrely.)

from scipy.optimize import minimize
from scipy.optimize.tests.test_minimize_constrained import Maratos

prob = Maratos()
result = minimize(prob.fun, prob.x0,
                  method='slsqp',
                  bounds=prob.bounds,
                  constraints=prob.constr)

Look at what's going on in _constraints.py around line 342. It's calculating both versions of the constraint function for each input requested by SLSQP, comparing them, and printing the results to the console. Sometimes they're the same, sometimes not. It's currently returning the correct version, so the optimization converges. If you have it return y2, it converges to an incorrect solution that depends on the guess.

@mdhaber
Contributor Author

mdhaber commented Aug 20, 2018

@antonior92 Similar issue with LinearVectorFunction.

@pv Issues with the VectorFunction and LinearVectorFunction classes are preventing the use of their Jacobian approximations in the old-style constraint dictionaries. This means I can't do option 2 "Convert whatever Jacobian is part of the new-style constraint to the old-style dict" until it is fixed.
Can I finish this PR with option 0), ignoring Jacobians provided in a new-style constraint, and then, when the above issue is resolved, make a new PR to enable option 2?

@mdhaber
Contributor Author

mdhaber commented Aug 21, 2018

I think I didn't do that very elegantly. Oops.

@antonior92
Member

hahahaha I have done the same thing once... It seems so easy, but it ends up merging master into your branch.

@mdhaber
Contributor Author

mdhaber commented Aug 22, 2018

@pv I think this is ready for comments.

Member

@pv pv left a comment

Looks fine to me. Only some (non-blocking) nitpicks.

However, there appears to be a merge gone bad (git rebase was followed by git merge or git pull of the old branch -> duplicated commits). I'll disentangle this and force-push it back now. Done



def new_constraint_to_old(con, x0):

Member

Can you add a one-sentence (or so) docstring stating what the function does, even if this is in principle clear from the name.

Contributor Author

Check



def old_constraint_to_new(ic, con):

Member

A one-sentence docstring here too, and similarly for the other functions.

Contributor Author

Check

old_constraints = new_constraint_to_old(con, x0)
constraints[i] = old_constraints[0]
additional_constraints = old_constraints[1:]

Member

I'd rewrite this as

if meth == 'trust-constr':
    for i, con in enumerate(constraints):
        if not isinstance(con, new_constraint_types):
            constraints[i] = old_constraint_to_new(i, con)
else:
    for i, con in enumerate(list(constraints)):
        if isinstance(con, new_constraint_types):
            old_constraints = new_constraint_to_old(con, x0)
            constraints[i] = old_constraints[0]
            constraints.extend(old_constraints[1:])

return constraints

Note that it is allowed to modify list contents during enumerate (but adding new items should be avoided).

Stylistically, it's better to have the meth == ifs on top level since the operations are different.

Contributor Author

@pv I agree with putting the meth == on the top level so that the branching only has to happen once.

But I don't understand how your code constraints.extend(old_constraints[1:]) agrees with your comment "...adding new items should be avoided". I had the additional_constraints list that gets extended (via +=) only after the loop is done because I wanted to avoid adding new items during the loop.

Which would you prefer?

(in any case, thanks for pointing it out, because it looks like there was a bug in my code here. additional_constraints = old_constraints[1:] should have been additional_constraints.extend(old_constraints[1:]))

Contributor Author

Also, why did you write for i, con in enumerate(list(constraints)): rather than constraints = list(constraints)? I wanted to ensure that constraints is a (mutable) list as I imagine the user could have passed in some other sort of sequence of constraints.

Member

@pv pv Aug 29, 2018

It's somewhat simpler to iterate over a copy of the list and mutate the original,
rather than iterating over the original and mutating the copy.
The two are not the same:

for i, cons in enumerate(list(constraints)):
    # mutate constraints

and

constraints = list(constraints)
for i, cons in enumerate(constraints):
    # mutate constraints

Contributor Author

I know they're different.
That approach makes sense.
I didn't see before that we were using list(constraints) in different ways. I was using it to ensure that constraints is a (mutable) list. You were using it to create a copy to iterate over. I think we actually need both.
Thanks!
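A tiny runnable illustration of the combined pattern on toy data: make sure constraints is a mutable list, then iterate over a snapshot while mutating the original.

```python
# Toy demonstration: iterate over a snapshot (list(...)) so the loop sees
# only the original items, while replacing in place and appending to the
# original list; this combines both uses of list() discussed above.
constraints = ('a', 'b')                      # user may pass any sequence
constraints = list(constraints)               # ensure a mutable list
for i, con in enumerate(list(constraints)):   # iterate over a copy
    if con == 'a':
        constraints[i] = 'a1'                 # replace in place
        constraints.extend(['a2', 'a3'])      # safe: the loop uses the copy
```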

return NonlinearConstraint(fun, lb, ub, jac)


def standardize_bounds(bounds, x0, meth):
Member

These two routines should preferably be moved to _minimize.py.
The meth names should preferably be referred to only there.
The meth names should preferably only be referred to there.

jac = lambda x: A

# when bugs in VectorFunction/LinearVectorFunction are worked out, use
# pcon.fun.fun and pcon.fun.jac. Until then, get fun/jac above.
Member

Please add FIXME: to the comment.

@pv
Member

pv commented Aug 24, 2018

I'll wait for CIs then merge.

@mdhaber
Contributor Author

mdhaber commented Aug 24, 2018

OK. After the merge, I'll get my branch up to date with SciPy master and address your comments in a new PR?

@pv pv merged commit 12f22bc into scipy:master Aug 25, 2018
@pv
Member

pv commented Aug 25, 2018

Thanks, merged!

Yes, maybe easiest to do remaining things in separate PR.

@pv pv added this to the 1.2.0 milestone Aug 25, 2018
@pv
Member

pv commented Aug 25, 2018

Re: the strange behavior with SLSQP.

SLSQP modifies the x vector it gives to callbacks later on, so maybe that's the reason for the strange caching behavior you see.

This is bad behavior, and one should probably change here https://github.com/scipy/scipy/blob/master/scipy/optimize/slsqp.py#L379

x_copy = np.copy(x)
fx = func(x_copy)
...

and similarly x -> x_copy for all the rest of places where x is used as a callback argument.
We do this for the user-provided callback function, but not for objective/constraints
https://github.com/scipy/scipy/blob/master/scipy/optimize/slsqp.py#L430

@pv
Member

pv commented Aug 25, 2018

Or better instead, at https://github.com/scipy/scipy/blob/master/scipy/optimize/slsqp.py#L426

x = np.copy(x)
slsqp(m, meq, x, xl, xu, fx, c, g, a, acc, majiter, mode, w, jw)

mdhaber added a commit to mdhaber/scipy that referenced this pull request Aug 30, 2018
rgommers added a commit that referenced this pull request Aug 31, 2018
MAINT: optimize: fixed @pv style suggestions from #9112