
Some improvements to density import #383

Merged: 18 commits into BradyAJohnston:main, Jan 2, 2024

Conversation

jojoelfe
Contributor

Hi Brady,

here are some improvements for the density import:

  • Density can optionally be centered on import
  • Object is added to the MN collection
  • Option to smooth the surface
  • Option to hide the dust that is typical in highly sharpened maps:
Screencast.from.2023-12-15.09-58-59.webm

There are some changes coming to the volume system in geometry nodes, so I understand if you want to hold off on this for now.

Best,

Johannes

@BradyAJohnston
Owner

Thanks for the PR!

The Blender developers are in the process of overhauling volumes inside Blender / GN, but it might not make it into 4.1, so I welcome improvements to the current system in the meantime.

I like the improvements you've made, especially the 'dust' removal.

Since you know more about the data formats than I do, would you be willing to make some other improvements as well?

Others that I can think of:

  • Automatic choice of a good starting threshold. I'm assuming this can be guessed from the data; we could then set the default value in the created node tree, so the user always sees something immediately rather than having to tweak thresholds manually after import.
  • An option for better normalisation of the data. Currently the volumetric display in Cycles doesn't work properly: some kind of normalisation seems to happen when the data is imported into Blender, but it gets weird when negative values are present. Masking out negative values and normalising to 1 before saving as VDB, so that Blender handles it better, might be a good import option as well. (A rough sketch of both ideas follows below.)
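
A minimal sketch of both suggestions in numpy (the 0.995 quantile is an illustrative guess here, not a settled choice):

```python
import numpy as np

def prepare_density(volume: np.ndarray):
    """Guess a starting threshold and normalise a density map for display."""
    # Guess a threshold from the data itself: a high quantile of the voxel
    # values tends to land on "something visible" for most maps.
    threshold = float(np.quantile(volume, 0.995))
    # Mask out negative values and normalise to 1 before the VDB
    # conversion, as suggested above.
    volume = np.clip(volume, 0.0, None)
    volume = volume / volume.max()
    return volume, threshold
```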

If you'd like to have a go, that would be fab; otherwise I'll put it on the list and address it in another PR.

@jojoelfe
Contributor Author

These are both reasonable things that would also help me out.

I've done some experiments, and a minor complication is that normalization and threshold detection ideally happen before the conversion to VDB, while the volume data is still in memory. But then that code doesn't run the second time the volume is loaded from the cached file. I think there are solutions, though.

Still, even after normalizing the density to 0–1, Cycles does not work at all with the imported density. I can get reasonable results using the density_surface node followed by "Mesh to Volume", but feeding the density directly into Cycles does nothing, while EEVEE does seem to recognize the VDB density:

Screencast.from.2023-12-17.12-02-24.webm

🤷🏼

@jojoelfe
Contributor Author

jojoelfe commented Dec 18, 2023

Upon further testing it looks like Cycles does not like the scaling transform applied in the vdb file.

Also, have you thought about how to write tests for the density import? One issue is that there is no pyopenvdb package on PyPI, and the bpy module does not come with it included, unlike a full Blender install.

@BradyAJohnston
Owner

Yep, the normalisation / threshold detection would have to happen before the conversion to VDB. Currently, if a .vdb with the same name already exists, I skip the conversion, so we could add a checkbox for redoing it (and making regeneration the default is probably logical; a sketch of the logic is below). I skip the conversion to save a bit of time, but if a user is changing settings during import they usually want to regenerate the file anyway.
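
A minimal sketch of that caching logic (the file layout and the `convert_to_vdb` helper are assumptions for illustration, not the actual MN code):

```python
import os

def path_to_cached_vdb(map_path: str, overwrite: bool = False) -> str:
    """Reuse an existing .vdb unless the user asked to regenerate it."""
    vdb_path = os.path.splitext(map_path)[0] + '.vdb'
    if os.path.exists(vdb_path) and not overwrite:
        return vdb_path                         # cached conversion, skip the work
    return convert_to_vdb(map_path, vdb_path)   # hypothetical converter
```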

As for the issues with cycles, I think we just add the option for normalisation and I can look into problems with cycles more at a later date. It's a tricky issue.

@BradyAJohnston
Owner

As for your question about testing: as you've probably noticed, I simply haven't been testing this, because of the lack of pyopenvdb on pip. I did have a workflow that would download a properly bundled version of Blender and then run the testing suite with that instead of the pip version; the plan was to add volume testing through that workflow exclusively (https://github.com/BradyAJohnston/MolecularNodes/blob/dev-render/.github/workflows/releases.yml). In the last month or so, Blender was the target of a pretty major and sustained DDoS attack. To combat it, they changed a bunch of their web infrastructure, which now breaks the automatic downloading of bundled releases. We could try downloading some specific releases (or maybe just a Linux release) and hosting them in another repo, which the workflow could use to run a specific test suite.

If you'd like to have a go, by all means, but I'm happy to just test this manually for now. I've been meaning to post on the Blender forums asking whether it will be possible to use pyopenvdb through the pip version at all; bpy still handles importing of VDB objects, so the functionality should be in there somewhere.

@jojoelfe
Contributor Author

I have a solution for the recreation of the VDB file: it stores whether the volume was inverted or normalized in the VDB metadata, and recreates the file if those options no longer match (sketched below).
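
Roughly the idea (assuming pyopenvdb's file-level metadata round-trips through `write` / `readAll`; the `MN_`-prefixed keys are made up for illustration):

```python
import pyopenvdb as vdb

def write_vdb_with_options(grid, vdb_path, invert, normalize):
    # Stash the import options as file-level metadata alongside the grid.
    vdb.write(vdb_path, grids=[grid],
              metadata={'MN_invert': invert, 'MN_normalize': normalize})

def vdb_is_stale(vdb_path, invert, normalize):
    # Compare the stored options against what the user now requests.
    _grids, meta = vdb.readAll(vdb_path)
    return (meta.get('MN_invert') != invert
            or meta.get('MN_normalize') != normalize)
```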

As for the Cycles problem: it seems Cycles doesn't like the negative scaling factor on the Z axis. I'll try to find another way of doing that. Are the rotation around Y and the inversion of Z needed because X and Z are swapped, due to numpy's Z,Y,X notation?

@BradyAJohnston
Owner

I think Cycles doesn't like negative data at all, so we could also just mask out negative density on import.

The rotations are just the ones I had to apply to get the volume to align with structures from the PDB; I basically just tried a bunch of them until they worked.


codecov bot commented Dec 20, 2023

Codecov Report

Attention: 2 lines in your changes are missing coverage. Please review.

Comparison is base (7dad494) 74.31% compared to head (0893c2d) 76.28%.

Files                          Patch %   Lines
molecularnodes/io/density.py   93.75%    2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #383      +/-   ##
==========================================
+ Coverage   74.31%   76.28%   +1.96%     
==========================================
  Files          32       32              
  Lines        3029     3057      +28     
==========================================
+ Hits         2251     2332      +81     
+ Misses        778      725      -53     


@jojoelfe
Contributor Author

I got it working in Cycles. The problem was the -1 in this:

```python
grid.transform.scale(np.array((-1, 1, 1)) * world_scale * voxel_size)
```

The -1 (and the rotation) can be removed by applying this before the conversion to VDB:

```python
volume = np.copy(np.transpose(volume, (2, 1, 0)), order='C')
```

What made this hard is that openvdb seems to copy the data straight from numpy's memory without checking any of the striding information. The np.copy with order='C' forces numpy to lay the data out in memory the way openvdb expects.
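
Putting the pieces together, a sketch of the fixed conversion (`volume`, `voxel_size`, and `world_scale` follow the snippets above; the `createLinearTransform` call is my assumption about the pyopenvdb API, not the exact MN code):

```python
import numpy as np
import pyopenvdb as vdb

def map_to_grid(volume, voxel_size, world_scale):
    # Reorder from numpy's (Z, Y, X) to (X, Y, Z) and force a C-contiguous
    # copy, since openvdb reads the raw buffer without consulting strides.
    volume = np.copy(np.transpose(volume, (2, 1, 0)), order='C')
    grid = vdb.FloatGrid()
    grid.copyFromArray(volume)
    # Uniform, positive scaling only: no -1 factor and no compensating rotation.
    grid.transform = vdb.createLinearTransform(voxelSize=voxel_size * world_scale)
    return grid
```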

The threshold is now set to the 0.995 quantile (99.5th percentile) of the map, which appears to work reasonably well on a variety of maps. The normalization maps the values to 0–10, since Blender's number widgets default to 0.1 increments, and that works reasonably well at this scale.
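
In numpy terms, roughly (a sketch, not the exact MN code):

```python
import numpy as np

# `volume` is the imported density as a numpy array (see above).
threshold = float(np.quantile(volume, 0.995))  # 99.5th-percentile starting guess
volume = 10.0 * (volume - volume.min()) / (volume.max() - volume.min())  # map to 0..10
```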

There is some cleanup still needed:

  • Make centering work
  • Make normalization optional
  • Make caching of vdb files optional

@jojoelfe
Contributor Author

This is a simple test with the density from the test data:

image

@BradyAJohnston
Owner

Thanks for figuring this out! There are a few other places where I have to specify order='C' or Blender instantly crashes, so I'm not surprised that was the culprit in the end.

@BradyAJohnston
Owner

Nice work getting the test to run on Linux, actually having a test will be a big relief!

@BradyAJohnston
Owner

Current implementation works pretty great!

image

RE: normalisation. Again, this kind of data isn't my expertise, but would it be better to keep the same 'units' and 'scale' used in other software and in the data itself? If we skip the normalisation but still mask values below zero, the same threshold value that someone uses in other software will give them the same result in MN, which matters especially if someone is trying to recreate a figure to match other software.

@jojoelfe
Contributor Author

Nice work getting the test to run on Linux, actually having a test will be a big relief!

Yes, quite pleased that this works. It's not super useful yet, because one cannot test the actual geometry being created (nor can the modifier be applied). This seems to be a limitation of the current volume implementation and might change with the new system, so I don't think it makes much sense to put a lot of effort into this now.

Your render looks super nice. I wonder if we should include one or two styles for volume based rendering. Would you mind sharing a screenshot of the material node tree you used?

RE: normalisation.

Yes, keeping the values of the original map should definitely be an option, because people often already know good thresholds for their maps. The reason I chose normalizing from 0 to 10 was that otherwise dragging or clicking the arrows of the threshold widget has quite sudden effects. Maybe there is a way of modifying the default step size (I think it's 0.1; see the sketch below). I also should double-check whether masking negative values is strictly necessary.
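
For what it's worth, if the threshold were exposed as an add-on/operator property rather than a node-socket default, bpy.props.FloatProperty accepts a step argument (interpreted as step/100 per increment). A hypothetical sketch:

```python
import bpy

class MN_OT_density_example(bpy.types.Operator):
    """Hypothetical operator, only to illustrate the `step` argument."""
    bl_idname = "mn.density_example"
    bl_label = "Density Import (example)"

    threshold: bpy.props.FloatProperty(
        name="Threshold",
        default=0.8,
        step=1,  # UI increment of step/100 = 0.01 per click or drag tick
    )

    def execute(self, context):
        return {'FINISHED'}
```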

@BradyAJohnston
Owner

I just had a play around: if you create another object (say, a cube) and give it a GN modifier that pulls the info from the volume object, then you can apply the modifier / get the mesh from it, since it's a 'mesh' object that just receives the resulting mesh from the volume object.

CleanShot 2023-12-24 at 13 44 48@2x

It's a bit of a runaround until they improve the volume handling in GN.
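
In script form, the workaround might look roughly like this (the helper-object name is hypothetical; the depsgraph evaluation and `new_from_object` are standard bpy calls):

```python
import bpy

# A second object whose GN modifier pulls its geometry from the volume.
helper = bpy.data.objects['DensityDebugCube']  # hypothetical helper object

# Evaluate the depsgraph so the modifier output is realised, then extract
# it as a plain mesh that a test can inspect.
depsgraph = bpy.context.evaluated_depsgraph_get()
mesh = bpy.data.meshes.new_from_object(helper.evaluated_get(depsgraph))
print(f"{len(mesh.vertices)} vertices realised from the volume")
```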

I also should double check whether masking negative values is strictly necessary.

If it ended up being the transforms that I was applying that was the problem, then maybe we can get away without having to mask anything.

I'll come back to this and do a more detailed review after Christmas. Thanks for all of the work on it!

fix eval using mesh
@jojoelfe
Contributor Author

Hey,

sorry I've just been testing things and haven't really come back here.

  • The tests seem to be working. I saw you cleaned things up, which is nice. Since you now create the debug object in code, we can probably delete the object from the data file.
  • I think negative density values are no problem, so I wonder if we should just remove the normalization option. I have it set to normalize between 0 and 1, which I thought might be neat for keeping consistent values in shaders, etc., but it might actually be easier to do this normalization in node trees.

Happy new year!

Johannes

@BradyAJohnston
Owner

I removed the normalize option like you suggested (now that it's all working properly, I like the idea of just importing the data and normalizing it yourself in the shader / node tree).

@BradyAJohnston BradyAJohnston merged commit a768356 into BradyAJohnston:main Jan 2, 2024
5 checks passed