
ENH: scipy.optimize - adding differential_evolution #3446

Merged · 43 commits · May 16, 2014

Conversation

andyfaff
Contributor

Differential evolution (DE) is an algorithm that is widely used for minimizing functions that possess multiple local minima, which makes it popular in curve fitting applications. This pull request adds the differential_evolution function to scipy.optimize, and will complement basinhopping.

[1] Storn, R and Price, K, Differential Evolution - a Simple and
Efficient Heuristic for Global Optimization over Continuous Spaces,
Journal of Global Optimization, 1997, 11, 341 - 359.
[2] http://en.wikipedia.org/wiki/Differential_evolution
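For context, the function as merged is importable from scipy.optimize (from SciPy 0.15 onwards). A minimal usage sketch on the Rosenbrock test function; the `seed` keyword is used here only to make the run reproducible:

```python
import numpy as np
from scipy.optimize import differential_evolution, rosen

# DE only needs box bounds, not a starting point or gradients,
# which is part of what makes it attractive for curve fitting.
bounds = [(0, 2), (0, 2)]
result = differential_evolution(rosen, bounds, seed=1)

# By default the best population member is polished with L-BFGS-B,
# so the result lands very close to the true minimum at (1, 1).
print(result.x, result.fun)
```

The return value is a standard OptimizeResult, so it slots into existing scipy.optimize workflows.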


@rgommers
Member

@andyfaff there's some merge conflict already. Can you rebase on current master? That'll make the tests on TravisCI run. And test coverage will be reported.

@rgommers
Member

For future commits, commit messages in the form described at http://docs.scipy.org/doc/numpy-dev/dev/gitwash/development_workflow.html#writing-the-commit-message would be good to have.

@rgommers
Member

Also, you have a bunch of commit messages that are (almost) exact duplicates of each other. Would be good to squash those.


__all__ = ['differential_evolution']

MACHEPS = 2.22e-16

np.finfo(np.float64).eps is less magical.
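The suggested replacement is a one-liner; np.finfo documents where the constant comes from and stays correct by construction:

```python
import numpy as np

# Machine epsilon for IEEE 754 double precision (2**-52),
# instead of the hard-coded approximation 2.22e-16.
MACHEPS = np.finfo(np.float64).eps
print(MACHEPS)
```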

@rgommers
Member

I made some initial comments based on a quick read. Will try to play with this later, kind of low on bandwidth now. Anyway looks promising.

@andyfaff
Contributor Author

Thanks for the comments; a large proportion of the issues are due to my newness to git...
EDIT: I will clean up the commits and make them sane after rebasing on master. The learning curve of git is quite steep.

@andyfaff
Contributor Author

@dlax, I don't know why the Travis is failing on the Python2.6 machine.
My git commands were:

git checkout diffev
git fetch scipymaster
git rebase scipymaster/master
git push origin diffev

Here is my remote list:

git remote -v

origin https://github.com/andyfaff/scipy.git (fetch)
origin https://github.com/andyfaff/scipy.git (push)
scipymaster https://github.com/scipy/scipy.git (fetch)
scipymaster https://github.com/scipy/scipy.git (push)

When I look at gitk the code was rebased onto scipy/master 732c0f7. That's the very tip of the scipy/master branch. The test is failing in code that's not been touched by me.

On 15 May 2014 18:29, Denis Laxalde notifications@github.com wrote:

@andyfaff I'd like to have a clean travis build before merging. Could you rebase your branch on top of current master and push it again? (I just tried, it works smoothly without conflict.)

git checkout diffev
git fetch upstream # assuming upstream is the name of the official scipy repo
git rebase upstream/master
git push origin diffev # origin being your clone on github



@argriffing
Contributor

The failing test is in code that I wrote which looks completely unrelated. It looks like a NaN sneaked into a test matrix, and earlier versions of python/numpy do not appreciate taking abs of NaN?

@andyfaff
Contributor Author

So why aren't other recent Travis tests failing?


@argriffing
Contributor

So why aren't other recent Travis tests failing?

I have no idea. The failing test uses random numbers, but it sets a seed.

@andyfaff
Contributor Author

This PR involves a lot of random numbers and initialising of numpy.random.RandomState. I wonder if that's related?


@pv
Member

pv commented May 16, 2014

NaNs popping out from nowhere can be a symptom of mismatching C prototypes (e.g. double foo() declared as int foo() so that FP flags aren't properly cleared on return).

This cannot be due to this PR, as it only touches Python code. However, if this is the cause, the error can be anywhere in the Scipy codebase, including any libraries we use (there were bugs like this in the SuperLU code). One typical pattern is that the failure disappears if you run only a part of the test suite.

Another possibility is that this is something Numpy 1.5.1 related.

@andyfaff
Contributor Author

This commit is pure Python.

@pv
Member

pv commented May 16, 2014

Yes, this class of bugs is exactly one that can be exposed by unrelated changes that do not actually cause the issue.

@pv
Member

pv commented May 16, 2014

OTOH, now that I think of it, these issues were previously always specific to i386 (cf. e.g. 4796e0d), whereas the failing build bot is 64-bit. I don't recall whether this can occur on 64-bit platforms, as the calling convention is IIRC different...

The reason why the NaN appears should be traced to resolve this.

@andyfaff
Contributor Author

Unfortunately I don't know enough about the scipy/numpy code base to go hunting for the cause myself.

@argriffing
Contributor

This PR involves a lot of random numbers and initialising of numpy.random.RandomState. I wonder if that's related?

This seems likely to me. If your PR is changing the random numbers in unrelated tests, then the failing expm_multiply test might be getting an unlucky random matrix. If I had more time I would try to make that particular test more robust.

@pv
Member

pv commented May 16, 2014

The failing test sets the seed of the random generator, so it seems unlikely to me that this is due to differences in the random sequence. Anyway, in my opinion this does not block merging, since it's clearly not something introduced by this PR.

I'm also not able to reproduce the issue on my machine.
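That isolation is easy to demonstrate: a private numpy.random.RandomState instance keeps its own stream, so a test that seeds the global generator draws the same numbers no matter how heavily a private instance is used elsewhere. A small sketch of the general point (not code from this PR):

```python
import numpy as np

# What a test that calls np.random.seed(0) expects to draw.
np.random.seed(0)
expected = np.random.rand(3)

# Re-seed, then draw heavily from a *private* RandomState, as code
# using its own instance does; the global stream is not advanced.
np.random.seed(0)
private = np.random.RandomState(42)
private.rand(1000)

print(np.allclose(np.random.rand(3), expected))  # True
```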

dlax added a commit that referenced this pull request May 16, 2014
ENH: scipy.optimize - adding differential_evolution
dlax merged commit 6d9801b into scipy:master on May 16, 2014
@dlax
Member

dlax commented May 16, 2014

Ok, merged. Thanks @andyfaff and everyone involved in the review.

@argriffing
Contributor

@andyfaff The description of this new global optimization feature could be added to https://github.com/scipy/scipy/blob/master/doc/release/0.15.0-notes.rst if you want to write a blurb?

@argriffing
Contributor

I tried to debug the failing expm_multiply test, but I've not been able to replicate it.

One problem is that I can't find the TravisCI traceback anymore, but I vaguely remember it was an abs failure in the exact one-norm in one of the scaled expm_multiply tests. I've instrumented the exact one-norm with a check that it doesn't produce NaNs, and I've increased the number of scaled tests from 10 to 1000, but I still haven't seen any weirdness of the kind that would have triggered that test failure.

@andyfaff
Contributor Author

Sure.
