simplify the signature for optimization wrappers #196

Closed
wants to merge 11 commits into
from

4 participants

@dlax
SciPy member

The idea is to have a more compact signature for the optimization wrappers and to ensure that the return value always has the same type, e.g.:

result = minimize(fun, x0, [jac, hess, constraints], options)

So, changes are:

  • The full_output parameter has been dropped from minimize and minimize_scalar.
  • Only result is returned. The solution can be accessed as result.x.
  • The retall parameter has been moved to an option: options['return_all'].
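As a usage sketch of the proposed interface (assuming the current `scipy.optimize.minimize`; the toy objective function is only an illustration):

```python
from scipy.optimize import minimize

# Toy quadratic objective; any callable f(x) -> float works here.
def fun(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

# retall is now spelled options={'return_all': True}; a single result
# object is returned regardless of which options are set.
res = minimize(fun, x0=[0.0, 0.0], method='nelder-mead',
               options={'return_all': True})
print(res.x)        # the solution, previously the bare return value
print(res.success)  # convergence flag
```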

If this receives positive feedback, I'll update PR #195 accordingly.

dlax added some commits Apr 18, 2012
@dlax
SciPy member

Define and use InfoDict — a subclass of dict with attribute accessors — as suggested by @pv in PR #195.
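The pattern being discussed can be sketched in a few lines (a simplified reconstruction for illustration, not the committed code):

```python
class InfoDict(dict):
    """dict subclass whose keys are also readable as attributes."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    # Attribute writes and deletes go straight to the underlying dict.
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

info = InfoDict(solution=[1.0, 2.0], success=True)
print(info.solution)  # same value as info['solution']
info.nit = 7          # stored as info['nit']
```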

@dlax
SciPy member

The default value for options is now None as discussed on the ML.

@jpaalasm jpaalasm and 1 other commented on an outdated diff Apr 22, 2012
scipy/optimize/optimize.py
+            return self[name]
+        except KeyError:
+            raise AttributeError(name)
+
+    __setattr__ = dict.__setitem__
+    __delattr__ = dict.__delitem__
+
+    def __repr__(self):
+        if self.keys():
+            m = max(map(len, self.keys())) + 1
+        else:
+            m = 0
+        return '\n'.join([k.rjust(m) + ': ' + repr(v)
+                          for k, v in self.iteritems()])
+
+    def __array__(self):
@jpaalasm
jpaalasm added a line comment Apr 22, 2012

Isn't this unnecessary magic?

@dlax
SciPy member
dlax added a line comment Apr 22, 2012
@pv pv commented on an outdated diff Apr 23, 2012
scipy/optimize/optimize.py
@@ -60,6 +60,29 @@ def derivative(self, x, *args):
self(x, *args)
return self.jac
+class InfoDict(dict):
+ """ dict subclass with attribute accessors """
@pv
SciPy member
pv added a line comment Apr 23, 2012

Here you can maybe say instead that this class represents information from an optimization solver, and list the common attribute names.

dlax added some commits Apr 24, 2012
@dlax dlax ENH: replace InfoDict by Result
`Result` is the only return from optimization wrappers. The solution is
available through the `x` attribute (previously named `solution` in
`InfoDict`).

All tests updated.
65766fc
@dlax dlax DOC: update optimize tutorial b24d915
@dlax
SciPy member

I've replaced the InfoDict class with Result, which is now the only value returned by minimize and minimize_scalar, as discussed on the ML. The solution can be accessed through the x attribute (previously named solution).

@charris
SciPy member

Looks like res.keys() is an easy way to see what is available, but that might be worth a comment in the documentation. Alternatively, you could add a show (show_doc?) method or something along those lines that would provide that information if something more specific to the optimization method is needed. Or is a simple res? in IPython sufficient?
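For illustration, discovering the available fields with `keys()` might look like this (assuming the current `scipy.optimize.minimize`; the objective is a toy example):

```python
from scipy.optimize import minimize

res = minimize(lambda x: (x[0] - 3.0) ** 2, x0=[0.0])

# The dict interface doubles as self-documentation:
print(sorted(res.keys()))

# Printing the result itself shows the aligned "key: value" listing
# produced by __repr__.
print(res)
```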

@jpaalasm jpaalasm commented on the diff May 6, 2012
scipy/optimize/optimize.py
+
+    Note
+    ----
+    This is essentially a subclass of dict with attribute accessors.
+    """
+    def __getattr__(self, name):
+        try:
+            return self[name]
+        except KeyError:
+            raise AttributeError(name)
+
+    __setattr__ = dict.__setitem__
+    __delattr__ = dict.__delitem__
+
+    def __repr__(self):
+        if self.keys():
@jpaalasm
jpaalasm added a line comment May 6, 2012

I would refactor this into

    if len(self):
        max_key_len = max(len(key) for key in self.keys())
        key_format_len = max_key_len + 1
        return '\n'.join([k.rjust(key_format_len) + ': ' + repr(v)
                          for k, v in self.iteritems()])
    else:
        return "Result()"
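As a self-contained sketch of that refactoring (note that `iteritems` is Python 2; under Python 3 it would be `items()`, used below so the sketch runs today):

```python
class Result(dict):
    """Minimal stand-in for the PR's Result class."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __repr__(self):
        if len(self):
            # Right-align keys so the values line up in one column.
            key_format_len = max(len(key) for key in self.keys()) + 1
            return '\n'.join(k.rjust(key_format_len) + ': ' + repr(v)
                             for k, v in sorted(self.items()))
        return "Result()"

print(repr(Result()))
print(repr(Result(x=1.0, success=True)))
```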
@jpaalasm jpaalasm and 1 other commented on an outdated diff May 6, 2012
scipy/optimize/optimize.py
@@ -60,6 +60,54 @@ def derivative(self, x, *args):
self(x, *args)
return self.jac
+class Result(dict):
+ """ Result of optimization solvers.
@jpaalasm
jpaalasm added a line comment May 6, 2012

The docstring could be clearer. What about:

Represents the optimization result.

Common attributes
-----------------
x : ndarray
    The optimization solution.
success : bool
    Whether or not the optimizer terminated successfully.
status : int
    Termination status of the optimizer. Its value depends on
    the optimizer. Refer to `message` for details.
message : str
    Description of the cause of termination.
fun, jac, hess : ndarray
    Values of the objective function, Jacobian and Hessian (if available).
nfev, njev, nhev : int
    The number of evaluations of the objective function and of its
    Jacobian and Hessian.
nit : int
    The number of iterations performed by the optimizer.

Note
----
This is essentially a subclass of dict with attribute accessors.
@pv
SciPy member
pv added a line comment May 6, 2012

The section title should also be Attributes so that it interacts well with the documentation machinery. You can mention above that solver-specific attributes may also be present.

@dlax
SciPy member

4c1eec3 includes the suggested improvements on the documentation.

@dlax
SciPy member

Anything else or may I merge this?

@jpaalasm

I still think that having `Result.__array__` return `self.solution` is a bad idea (shouldn't it be `self.x`, anyway?). It is unpythonic implicit magic in my opinion.

If you need the solution as an array, why not write `result.x`?

@pv
SciPy member
pv commented May 11, 2012

I think I suggested adding the __array__ method. I agree with Joonas that it's a bit strange, in the sense that asarray(solution) behaves as if solution were an array-like object --- although it is not. The additional convenience gained might not be that great either, so from my side, removing this piece of magic would also be OK.
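The magic under discussion can be illustrated with a small sketch (a hypothetical simplified class assuming NumPy; the actual PR code differs):

```python
import numpy as np

class Result(dict):
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __array__(self, dtype=None, copy=None):
        # The debated behaviour: np.asarray(result) silently yields
        # the solution vector rather than raising an error.
        return np.asarray(self['x'], dtype=dtype)

res = Result(x=[1.0, 2.0], success=True)
print(np.asarray(res))  # the solution as an ndarray -- implicit "magic"
print(res.x)            # the explicit, unambiguous alternative
```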

@dlax
SciPy member

merged

@dlax dlax closed this May 15, 2012