Allow multi-iter objects to handle more iterators. #226

Closed
wants to merge 3 commits

Conversation

walshb (Contributor) commented Mar 5, 2012

Hi

This is some sort of fix for http://mail.scipy.org/pipermail/numpy-discussion/2011-July/057831.html.

(By the way, the test above runs pretty slowly because of constructing a big array from the inputs).

But after that, it seems to work.

Cheers

Ben

charris and 1 other commented on an outdated diff Mar 6, 2012

numpy/core/src/multiarray/iterators.c
@@ -1503,6 +1503,25 @@
     return 0;
 }
+void
+multiter_allociters(PyArrayMultiIterObject *multi, int n)
+{
+    int i;
+
+    multi->numiter = n;
+
+    if (n <= 0) {
+        multi->iters = NULL;
+        return;
+    }
+
+    multi->iters = PyArray_malloc(n * sizeof(PyArrayIterObject *));
charris (Owner) commented Mar 6, 2012

I think this needs an error check. And also everywhere multiiter_allociters is called.

charris (Owner) commented Mar 6, 2012

Also, the argument to PyArray_malloc should be size_t. You might want some error check for negative values somewhere down the line.

mwiebe (Member) commented Mar 6, 2012

The argument to PyArray_malloc should be OK, since the sizeof operator returns a size_t. The error check, and returning an error code from this function, are definitely necessary, though.

charris commented on an outdated diff Mar 6, 2012

numpy/core/src/multiarray/iterators.c
@@ -1503,6 +1503,25 @@
     return 0;
 }
+void
+multiter_allociters(PyArrayMultiIterObject *multi, int n)
charris (Owner) commented Mar 6, 2012

I suspect this should be static, as well as have an error return.
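
A minimal sketch of how the helper might look after folding in the review comments above (static linkage, a guard on n, a NULL check on the allocation, and an int error return). This is illustrative only, reusing the names from the diff; it is not the code that was eventually pushed to the branch.

static int
multiter_allociters(PyArrayMultiIterObject *multi, int n)
{
    if (n <= 0) {
        /* Guard against zero or negative counts, per the review comments. */
        PyErr_SetString(PyExc_ValueError,
                        "need at least one input to iterate over");
        multi->numiter = 0;
        multi->iters = NULL;
        return -1;
    }
    multi->numiter = n;
    /* sizeof yields size_t, so the size computation is done in size_t. */
    multi->iters = PyArray_malloc(n * sizeof(PyArrayIterObject *));
    if (multi->iters == NULL) {
        PyErr_NoMemory();
        return -1;
    }
    return 0;
}

Callers would then be expected to check the return value, along the lines of: if (multiter_allociters(multi, n) < 0) { goto fail; }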

charris (Owner) commented Mar 6, 2012

Wouldn't it be simpler to just increase NPY_MAXARGS? I don't know if we want to spend too much time on this iterator as it should be replaced in the future with nditer.
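
For context, NPY_MAXARGS is a compile-time constant in NumPy's public headers; the definition is roughly as below (the header location and the value 32 are recalled for this era of NumPy and may not be exact).

/* numpy/core/include/numpy/ndarraytypes.h (approximate) */
#define NPY_MAXARGS 32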

mwiebe (Member) commented Mar 6, 2012

The email referenced indicated that some legacy code was using 16 million arguments, which would definitely need dynamic allocation to handle. Because this change is pretty simple, I think it's fine to apply once it's fixed up. Removing the NPY_MAXARGS limitation in nditer is slightly more involved, and I don't want to tackle that in C; it's more appropriate for C++.

A test which calls choose with 100 arguments or so should be added to this pull request, to demonstrate that it's working correctly.

walshb (Contributor) commented Jun 24, 2012

Well, finally got round to dealing with the remaining problems with this. Hope it comes in handy for someone...

Cheers

This pull request fails (merged f9811d5 into 7d225bb).

njsmith (Owner) commented Jul 15, 2012

The failure there is just the memcpy/memmove bug; really, this PR passed the tests when they were last run. (Though that was long enough ago that someone should probably re-run them before merging this.)

charris (Owner) commented May 3, 2013

@walshb Sorry for the long hiatus. Could you rebase this?

charris (Owner) commented Aug 16, 2013

@walshb Pretty much bitrotted at this point. The 1.8 branch will be made Aug 18.

charris (Owner) commented Mar 28, 2014

Closing this on account of staleness. @seberg Sounds like choose could use the select treatment.

charris closed this Mar 28, 2014
