
Scale array for points gaussian #4366

Merged
merged 26 commits into from
Dec 23, 2023

Conversation

kmarchais
Contributor

Overview

Following #2576, I propose this feature to allow scaling points_gaussian spheres by a scalars array. It makes it possible to display a large number of spheres with different sizes.

Example with ~5M particles (screenshots: full view and zoomed).

Details

  • add scale_array property
  • add scale parameter to add_mesh (for now it can only be the name of the array as a string; it could later accept True to select the active array, or the array itself)

@github-actions github-actions bot added the enhancement Changes that enhance the library label May 1, 2023
Comment on lines 3537 to 3543
if scale is not None and style == 'points_gaussian':
    if isinstance(scale, str):
        self.mapper.scale_array = scale
        self.mapper.emissive = emissive
        self.mapper.scale_factor = 1.0


Member

I'm wary of this block. Specifically, I'm wary of having the scale argument apply only to the point gaussian mapper while being silently ignored in all other scenarios.

Further, if scale is a number, should we set scale_factor with that value?

Additionally, I'm pretty sure it is unnecessary to set the emissive here as it should be set in the PointGaussianMapper instantiation on line ~3217 (~3198 head)

Contributor Author

@kmarchais kmarchais May 1, 2023

Thank you for your comments!

  • Indeed, scale is only used with the point gaussian mapper here (like emissive, for example). I'm creating a new condition here because, in this case, I need to execute the code from the render_points_as_spheres condition to run use_circular_splat, and I didn't want to move too much code.
  • For scale_factor, do you mean that scale could also be a number and work the way you describe?
  • About emissive, I forgot to explain this, it is set to True in the definition of use_circular_splat, maybe it should be done differently here?

Contributor

I'm also +1 on the functionality here being added to PyVista, I think the discussion is around what user interface to this should look like.

I am already confused by the interaction and overlap of all the options in add_mesh, e.g. when does point_size matter? How will this be different from scale in this PR? Do these interact? There are render_points_as_spheres=True/False and style='points_gaussian', and I believe the former does something different when the style changes. IMO the problem will continue to grow unless we address this as new options are added.

Member

@akaszynski akaszynski May 1, 2023

I think what we're seeing is "feature creep" in the form of adding parameter after parameter to add_mesh.

I think the original implementation is flawed, and @kmarchais is just following our design pattern of adding another kwarg to add_mesh.

How about this: add_points_gaussian. That's explicit and permits us to trim down the args in the already bloated add_mesh.

Contributor

I agree with @akaszynski that we need to rethink the design pattern. One path forward for this PR is to align on a name for this parameter, and then break it out into a separate method in the future, unless @kmarchais wants to take this part on. In this case, it would be okay if scale is ignored in all other cases.

Member

Let's add this for now in this PR and, as a follow-up, implement add_points_gaussian. The only caveat is that we shouldn't include this in the upcoming release this week, as awesome as this example is.

Contributor Author

Yes, it still requires some more improvement, such as handling different types in a similar way to the scale parameter for glyphs. Also, applying emissive here is probably not the best approach, but doing it in use_circular_splat probably isn't either.

Member

I have vague memories of someone planning to reduce the number of add_*() methods. It may have been @banesullivan? I'm not at all sure I remember correctly, and I can't find any related discussion, so I'm just pointing this out in case I do remember correctly; it's better to short-circuit contradictory fundamental design plans like this early.

Member

That was me, @adeak -- indeed, I want to reduce the number of add_*() methods on the plotter, but mostly with regard to special handling of dataset/mesh types, whereas this use case is special handling of a rendering technique, so I think an add_point_gaussian()-like method would be acceptable. Methods like add_lines() and add_points() need to go -- these should be handled by add_mesh(). I have a more thorough explanation of my rationale in some comments somewhere.

+1 for creating a clean compromise to merge this PR for release, then rethinking the API design later.

The conflict between scale and point_size also concerns me... I'm tempted to call this an "advanced" feature and have users set it on the mapper directly rather than through yet another kwarg in add_mesh(). With usage looking like:

actor = plotter.add_mesh(mesh, style='points_gaussian', ...)
actor.mapper.scale_array = 'name of array'

Member

@banesullivan, I've implemented your suggestion.

All, please verify that this API works. I think it balances everyone's requests.

Member

@banesullivan banesullivan left a comment

This is an awesome addition and an excellent example! Thanks for the contribution, @kmarchais!

@codecov

codecov bot commented May 3, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison is base (6d1583b) 96.38% compared to head (3f66001) 96.40%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #4366      +/-   ##
==========================================
+ Coverage   96.38%   96.40%   +0.01%     
==========================================
  Files         134      134              
  Lines       22721    22727       +6     
==========================================
+ Hits        21900    21909       +9     
+ Misses        821      818       -3     

@akaszynski
Member

@kmarchais, please check over the changes and let me know if you're fine with the new API:

>>> import numpy as np
>>> import pyvista as pv
>>> n_spheres = 1_000
>>> pos = np.random.random((n_spheres, 3))
>>> rad = np.random.random(n_spheres) * 0.01
>>> pdata = pv.PolyData(pos)
>>> pdata['radius'] = rad
>>> pl = pv.Plotter()
>>> actor = pl.add_mesh(
...     pdata,
...     style='points_gaussian',
...     emissive=False,
...     render_points_as_spheres=True,
... )
>>> actor.mapper.scale_array = 'radius'
>>> pl.enable_anti_aliasing('ssaa')
>>> pl.show()


akaszynski
akaszynski previously approved these changes May 4, 2023
@akaszynski akaszynski mentioned this pull request May 4, 2023
6 tasks
@kmarchais
Contributor Author

kmarchais commented May 4, 2023

Works perfectly for me!
I fixed a typo in the test file to scale by the correct array and suggest scaling down the array to avoid overlaps.
But I don't know how to update the test image properly...

Also if possible, it would be nice to be able to pass the array (sequence[float]) to the scale_array like in pyvista/core/filters/data_set.py in the glyph method.

@akaszynski
Member

Also if possible, it would be nice to be able to pass the array (sequence[float]) to the scale_array like in pyvista/core/filters/data_set.py in the glyph method.

I thought of this as well and I agree that it seems like a good idea. However, I'm wary of allowing two input types but returning only one output type. While this would be more convenient, it's likely this would lead to confusion since this is a property (getter/setter) and I'd prefer to avoid side effects.

@akaszynski
Member

@kmarchais, thanks for fixing the test. I ran it locally and it looks like the image is correct. I also verified that the scale indeed works and realized that we should raise if the point data array does not exist. Otherwise, it looks like it just scales everything to 1.

banesullivan
banesullivan previously approved these changes May 4, 2023
>>> pl.show()

"""
self.scale_factor = 1.0
Contributor

VTK allows for both of these settings to be applied independently, and the default for scale_factor is already 1.0 in vtk, but I haven't chased down the default in the pyvista code. This is a very useful feature. For example, I have data from a simulation that has particle size in mm stored in point data as "diameter", but the mesh positions are in units of m. I would want to set the scale_factor to 0.001 and scale_array to "diameter" to render this correctly.
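Assuming the two settings compose multiplicatively (effective size = scale_factor * per-point value -- my reading of the use case above, not verified against the VTK source), the mm-to-m example works out like this:

```python
# Particle diameters stored in point data in millimetres,
# while the mesh positions are in metres.
diameter_mm = [1.0, 2.5, 10.0]
scale_factor = 0.001  # mm -> m conversion applied globally

# Effective rendered size per point, if the settings multiply:
effective_size_m = [scale_factor * d for d in diameter_mm]
```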

Member

Ah yes, this shouldn't be hard-set here.

Member

And this is in the getter? Seems like a mistake.

Member

Suggested change
self.scale_factor = 1.0

Contributor Author

The function use_circular_splat does self.scale_factor *= 1.5, so at first I wanted to cancel that without modifying the function; I think the reset must be done somewhere.

Contributor

This is a bit of an interconnected rabbit hole.

This line seems mysterious to me:

self.mapper.scale_factor = prop.point_size * self.mapper.dataset.length / 1300

Then there is this line, which as @kmarchais points out further modifies scale_factor if a different option is used:

self.mapper.use_circular_splat(prop.opacity)

Contributor

We don't really define what point_size units are in our documentation. I have not been able to use this new points-gaussian feature successfully and have kept using glyphs, as it isn't clear how to get the right physical mesh units for rendering. I don't want to further complicate this PR, which was intended to be a simple addition, and I'm not sure there is a simple fix. Maybe it would be best to keep the current state, i.e. hard-code the scale factor, but in the setter, and then raise an issue to disentangle this complexity in a more holistic PR?

@akaszynski
Member

As much as I'd like to get this out in v0.39.0, I feel this feature isn't quite complete. I agree with @MatthewFlamm, we really need to overhaul our point size documentation and implementation.

For the moment, @kmarchais, I'm afraid you'll have to use the VTK API to scale these points (or this branch).

@akaszynski
Member

@kmarchais, I'd like to have this in the next release (0.43), so please ping me when it's ready for review.

@kmarchais
Contributor Author

@akaszynski I've just updated this pull request to incorporate the latest developments from the main branch, allowing me to use this feature. It's working as expected on my end, so I believe it's ready. However, considering our prior discussion regarding its compatibility with the point size implementation, I'm not sure if it fits perfectly with your earlier comments.

@tkoyama010 tkoyama010 marked this pull request as ready for review September 14, 2023 15:39
@akaszynski
Member

considering our prior discussion regarding its compatibility with the point size implementation, I'm not sure if it fits perfectly with your earlier comments

The main issue we have is our current implementation is non-ideal. Point size documentation is lacking and changing this will likely be non-backwards compatible in some ways. I wasn't comfortable with including this on the last few releases since we didn't have time to let this sit on main and get usage feedback.

Provided we review and agree on the changes moving forward, I'd be happy to include this, as long as it cleans up our implementation.

Member

I don't think this test is sufficient. Our image regression comparison likely won't be able to pick up on changes to these small dots. Can you make a more visually distinct test?

Contributor Author

With a scaling of 0.05 instead of 0.01 in tests/plotting/test_plotting.py for sphere["z"] = sphere.points[:, 2] * 0.010, we get the following result (screenshot attached). I'm not sure how to upload the new result, though.

Member

Thanks for this! We'll handle updating the baseline image if you're not able.

Running the single test with the --reset_image_cache option will do it. I.e.:

pytest tests/plotting/test_plotting.py -k test_plot_points_gaussian_scale --reset_image_cache

@banesullivan
Member

Overall, this looks okay to me and it's a greatly needed feature. @kmarchais, I sincerely apologize that it's taken so long to merge this... thank you for your patience. I propose we merge this shortly after the 0.43 release this week and let it sit on main for testing/evaluation for the 0.44 release.

@tkoyama010, can you help me to remember to merge this after the 0.43 release?

@tkoyama010
Member

Copy. I'll schedule the merge to be done at the right time.

@MatthewFlamm
Contributor

#4366 (comment) still hasn't been resolved. I still do not like how scale_factor is handled, but this is not a problem unique to this PR as I pointed out in that thread. I do not have the time in the short term to tease out what would be required to fix this either. Can this line be moved into the setter at the very least?

@kmarchais
Contributor Author

Indeed, I just moved it into the setter.

The scale factor is changed in the mapper.py in use_circular_splat (*= 1.5) and use_default_splat (/= 1.5).
And in plotter.py > add_mesh(), when using style=points_gaussian and render_points_as_spheres=True it is modified by these lines:

if style == 'points_gaussian':
    self.mapper.scale_factor = prop.point_size * self.mapper.dataset.length / 1300
    if not render_points_as_spheres and not self.mapper.emissive:
        if prop.opacity >= 1.0:
            prop.opacity = 0.9999  # otherwise, weird triangles

    if render_points_as_spheres:
        if style == 'points_gaussian':
            self.mapper.use_circular_splat(prop.opacity)

I don't fully comprehend this part but that's why I need to reset it to 1.0 to use scale_array.

@tkoyama010
Member

github-actions preview

@banesullivan banesullivan self-requested a review December 9, 2023 03:53
Member

@banesullivan banesullivan left a comment

It's time to merge this! If any problems or issues arise from this change, we will hopefully catch them early in this next release cycle.

I made a few changes to finalize this:

  • Improved the image regression test so that it's more clear what's expected
  • Moved the example into the "Plotting" category
  • Ensured this example would not be interactive in our generated documentation
  • Minor docs changes

@banesullivan banesullivan merged commit 93fc70a into pyvista:main Dec 23, 2023
24 checks passed