[Speed] Large Mesh #22
I have a mesh with around 1.5 million vertices/3 million faces. Is pyoctree suitable for meshes this large? Tree creation has been running for over an hour and I was wondering if that's to be expected or if something is wrong.
Comments
Hi asml, My meshes are generally on the order of 10,000-100,000 faces, and take only a second or so to construct the octree. Your 3 million faces could slow things down, but > 1 hour sounds excessive. One possible issue is during the conversion of your Python arrays to C++ arrays, which is handled by the Cython wrapper. If you are willing to share your code and mesh, I would be happy to have a look to find a bottleneck there or elsewhere. Another suggestion is that you decimate your mesh first to reduce the tri face count before running pyoctree. Cheers,
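For anyone reading along, a decimation pass along these lines can bring the face count down before building the octree. This is a minimal sketch that assumes Open3D is installed; the file name and the 300,000-face target are placeholders, not values from this thread.

```python
import numpy as np
import open3d as o3d
from pyoctree import pyoctree as ot

# Load the full-resolution mesh (placeholder file name) and decimate it.
mesh = o3d.io.read_triangle_mesh("mesh.stl")
mesh = mesh.simplify_quadric_decimation(target_number_of_triangles=300_000)

# pyoctree is called with an Nx3 vertex array and an Mx3 int32 face array,
# matching the call shown in the next comment. float64 vertices are an
# assumption here, not something confirmed in this thread.
verts = np.asarray(mesh.vertices, dtype=np.float64)
faces = np.asarray(mesh.triangles, dtype=np.int32)

tree = ot.PyOctree(verts, faces)
```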
Thanks for your quick reply (and awesome project, by the way :))! You're right, after simplifying the mesh to 150k vertices/300k faces, tree construction takes just a couple of seconds. I cannot share the mesh, unfortunately, because it's confidential data, but the actual call to pyoctree is simply ot.PyOctree(self.mesh.verts, self.mesh.faces), where the former is a 1.5 million by 3 float array and the latter a 3 million by 3 int32 array. No surprises there. Memory shouldn't be an issue either. It seems like something doesn't scale linearly.
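One cheap thing to rule out on the caller's side is an implicit copy or dtype conversion happening during the Python-to-C++ conversion step mentioned above. Below is a small sketch; the float64/int32 expectations are assumptions based on the arrays described in this thread, not a documented contract.

```python
import numpy as np
from pyoctree import pyoctree as ot

def build_tree(verts, faces):
    # Force C-contiguous arrays in the expected dtypes up front so the
    # wrapper's conversion step does not silently copy or convert 3M rows.
    verts = np.ascontiguousarray(verts, dtype=np.float64)
    faces = np.ascontiguousarray(faces, dtype=np.int32)
    return ot.PyOctree(verts, faces)
```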
Hi Andreas, I will look into this in the next few days. The obvious thing here is the use of a contiguous array to hold the tree data. Currently I am appending to an STL vector, which is probably contiguous up to a certain point, but after a certain number of polys the data will be scattered around in memory, leading to the speed reductions you are observing. Cheers,
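To put numbers on the scaling question, a rough timing loop over face-count subsets is one way to check whether construction time grows faster than linearly. This is only a sketch: slicing the face array is a crude way to vary the load, and the dtypes are the same assumptions as above.

```python
import time
import numpy as np
from pyoctree import pyoctree as ot

def time_construction(verts, faces, fractions=(0.1, 0.25, 0.5, 1.0)):
    # Build the octree on progressively larger face subsets and report timings.
    verts = np.ascontiguousarray(verts, dtype=np.float64)
    for frac in fractions:
        n = int(len(faces) * frac)
        subset = np.ascontiguousarray(faces[:n], dtype=np.int32)
        start = time.perf_counter()
        ot.PyOctree(verts, subset)
        elapsed = time.perf_counter() - start
        print(f"{n} faces: {elapsed:.2f} s")
```

If the times grow much faster than the face counts, that points at the tree-construction step itself rather than the array conversion.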