cwatershed fails on large 3D images (>2**31 voxels) #102
Comments
Wow, I thought 2³¹ voxels should be enough for everybody, but I guess I was wrong (seriously, I never even thought this could be an issue). Obviously, the fix is easy, so I will just do it.
I just committed what I think is a fix. Do you have the ability to test the github version and tell me if it works?
Thanks, will test it today.
Bad news, it didn't fix it. Going above 2**31 voxels now leads to a segfault, so unfortunately there's no exception message to give you. Here's my very simple test code, if it helps. Change the 512 slices to 511 slices and it seems to work fine; with 512 slices, it segfaults.
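The size arithmetic behind the report can be checked directly. This is a minimal sketch: the 2304x2304x2475 tomogram shape comes from the issue itself, but the 2048x2048 slice dimensions are an assumption for illustration, since the thread states only the slice counts.

```python
# Voxel-count arithmetic around the signed 32-bit boundary.
INT32_MAX = 2**31 - 1  # largest value a C `int` can hold on common platforms

def n_voxels(nz, ny, nx):
    """Total voxel count of an nz x ny x nx volume."""
    return nz * ny * nx

# Full tomogram from the report: far beyond the 32-bit range.
print(n_voxels(2475, 2304, 2304) > INT32_MAX)  # True (about 1.3e10 voxels)

# Hypothetical 2048x2048 slices: 512 slices is exactly 2**31 voxels,
# while 511 slices stays just under the boundary.
print(n_voxels(512, 2048, 2048))  # 2147483648 == 2**31
print(n_voxels(511, 2048, 2048))  # 2143289344 <  2**31
```

Under that assumed slice size, the 511-vs-512 flip in the test lines up exactly with the first index that no longer fits in a signed 32-bit integer.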
I was trying to test this, but it's incredibly slow to work with such very large matrices on the only machine I have access to that has enough memory for this (it's a pretty old and slow machine, but it has a lot of memory). I switched a few remaining uses of int (32 bits) to npy_intp (64 bits), so this may now work (if you wait long enough; at least it hasn't crashed yet).
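The effect of that int-to-npy_intp change can be sketched without mahotas by emulating a C `int` with ctypes. This illustrates only the overflow mechanism, not the actual `_morph.cpp` code:

```python
import ctypes

# Emulate the flat voxel index computed in C. With 32-bit `int` arithmetic
# the index wraps negative once the volume reaches 2**31 voxels; with a
# 64-bit type (which npy_intp is on 64-bit platforms) it stays correct.
flat_index = 2**31  # first voxel past the signed 32-bit range

wrapped = ctypes.c_int32(flat_index).value  # what `int` produces
correct = ctypes.c_int64(flat_index).value  # what npy_intp produces

print(wrapped)  # -2147483648: a negative index -> zeros output or a segfault
print(correct)  # 2147483648
```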
Thanks, I'll try to test it. I don't have access to the computer I was using, but I'll see what I can do.
Bugfix release. Full `ChangeLog`:
* Make watershed work for >2³¹ voxels (issue #102)
* Remove milk from demos
* Improve performance by avoiding unnecessary array copies in ``cwatershed()``, ``majority_filter()``, and color conversions
* Fix bug in interpolation
I was attempting to process a 2304x2304x2475 voxel tomogram using `cwatershed`. I'd keep getting an array of zeros out. I eventually found that processing just 100 slices worked fine, so I kept increasing the number of slices until it failed. It seems the issue is that as soon as the number of voxels exceeds 2**31, `cwatershed` fails (either giving an array of zeros, or once even causing Python to segfault). Looking at `_morph.cpp`, I think the problem is that `int` is used for indices, e.g. line 599, instead of `int64_t` or `uint64_t`.