
Enable binary_blobs working on anisotropic meshes #5160

Open
patquem opened this issue Dec 27, 2020 · 2 comments

patquem (Contributor) commented Dec 27, 2020

Hi,

Presently, skimage.data.binary_blobs is limited to isotropic meshes (through the length argument).
This limitation can be very constraining in some cases.

For example, suppose you wish to generate blobs on an array of shape (1000, 1000, 10): it is not really convenient to have to generate blobs on a (1000, 1000, 1000) array and then truncate it - the generation takes far too long!

Fortunately, the isotropy constraint in the code could be lifted "easily".
Given a lengths argument of the form (length_0, length_1, length_2, ...), one per direction, in place of the length and n_dim arguments, only the 4 following new lines are needed in the binary_blobs function to generalize it to anisotropic meshes:

```python
n_dim = len(lengths)   # 1st new line
length = max(lengths)  # 2nd new line

rs = np.random.RandomState(seed)
shape = tuple([length] * n_dim)
mask = np.zeros(shape)
n_pts = max(int(1. / blob_size_fraction) ** n_dim, 1)
points = (length * rs.rand(n_dim, n_pts)).astype(int)  # int, not the deprecated np.int
mask[tuple(points)] = 1
for i, ind in enumerate(lengths):           # 3rd new line
    mask = np.delete(mask, np.s_[ind:], i)  # 4th new line: crop axis i to length ind
mask = gaussian(mask, sigma=0.25 * length * blob_size_fraction)
threshold = np.percentile(mask, 100 * (1 - volume_fraction))
return np.logical_not(mask < threshold)
```

Feel free to adapt the last 2 new lines with something more clever :)
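For reference, the snippet above can be wrapped into a minimal, self-contained sketch. The function name `binary_blobs_anisotropic` is hypothetical, and `scipy.ndimage.gaussian_filter` stands in for skimage's `gaussian` so the example runs on its own; the default parameter values mirror `skimage.data.binary_blobs`:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def binary_blobs_anisotropic(lengths, blob_size_fraction=0.1,
                             volume_fraction=0.5, seed=None):
    """Sketch of the proposal: seed points on an isotropic cube of side
    max(lengths), crop each axis to its requested length, then smooth
    and threshold to the requested volume fraction."""
    n_dim = len(lengths)
    length = max(lengths)
    rs = np.random.RandomState(seed)
    mask = np.zeros(tuple([length] * n_dim))
    n_pts = max(int(1. / blob_size_fraction) ** n_dim, 1)
    points = (length * rs.rand(n_dim, n_pts)).astype(int)
    mask[tuple(points)] = 1
    # crop the cube down to the requested anisotropic shape
    mask = mask[tuple(slice(0, ind) for ind in lengths)]
    mask = gaussian_filter(mask, sigma=0.25 * length * blob_size_fraction)
    threshold = np.percentile(mask, 100 * (1 - volume_fraction))
    return mask >= threshold

blobs = binary_blobs_anisotropic((64, 64, 8), seed=0)
print(blobs.shape)  # (64, 64, 8)
```

Because the percentile threshold is computed on the cropped array, the returned mask has approximately the requested volume fraction of True voxels.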

Patrick

patquem (Contributor, Author) commented Dec 28, 2020

Hi again,

Looking more closely, the previous proposal fails to preserve the volume_fraction property when working with large, strongly anisotropic meshes.

Assuming a (100, 100, 1) mesh, 99% of the points are removed (3rd and 4th new lines), and the major contribution of the 'neighbor' points during the Gaussian convolution is missing on the 'slice'.

This problem is clearly visible when working with strongly anisotropic meshes (with few points along one axis).
I think this kind of problem is also present on isotropic meshes, near the borders, since "external" points are not taken into account to ensure the continuity of the blob density. (Surrounding "ghost cells" could fix the problem, but that's another question...)
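The "ghost cells" idea mentioned above could be sketched as follows. Everything here is an illustrative assumption: the function name `blobs_with_ghost_cells`, the pad width, and the use of `scipy.ndimage.gaussian_filter` are not part of the original proposal. The point is that smoothing happens on a padded domain, so off-slice seed points still contribute near the borders before the crop and threshold:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blobs_with_ghost_cells(lengths, blob_size_fraction=0.1,
                           volume_fraction=0.5, seed=None, pad=8):
    """Hypothetical sketch: seed points on a domain padded by `pad`
    ghost cells per side, smooth, crop back to `lengths`, then
    threshold on the cropped region only."""
    rs = np.random.RandomState(seed)
    padded = tuple(n + 2 * pad for n in lengths)
    mask = np.zeros(padded)
    n_pts = max(int(1. / blob_size_fraction) ** len(lengths), 1)
    # seed points everywhere on the padded domain, ghost cells included
    points = np.stack([rs.randint(0, n, n_pts) for n in padded])
    mask[tuple(points)] = 1
    sigma = 0.25 * max(lengths) * blob_size_fraction
    mask = gaussian_filter(mask, sigma=sigma)
    # discard the ghost cells after smoothing, before thresholding
    mask = mask[tuple(slice(pad, pad + n) for n in lengths)]
    threshold = np.percentile(mask, 100 * (1 - volume_fraction))
    return mask >= threshold

blobs = blobs_with_ghost_cells((100, 100, 1), seed=0)
print(blobs.shape)  # (100, 100, 1)
```

A proper implementation would also rescale the point density to the padded volume; this sketch only illustrates the border-continuity idea.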

Patrick

@scikit-image scikit-image locked and limited conversation to collaborators Oct 18, 2021
@scikit-image scikit-image unlocked this conversation Feb 20, 2022
grlee77 (Contributor) commented Feb 20, 2022

Reopening, as I'm not sure this was intentionally closed?

@grlee77 grlee77 reopened this Feb 20, 2022