
Memory crash from compute_volumetric_geodesic.py #50

Closed
artursahak opened this issue Jun 30, 2021 · 5 comments

Comments

@artursahak

Dear zhan-xu, I tried to run compute_volumetric_geodesic.py on my own dataset (structured much like your preprocessed folder), but it crashes: "Unable to allocate 76.2 GiB for an array with shape (1136630306, 3, 3)".

Traceback (most recent call last):
  File "compute_volumetric_geodesic.py", line 173, in <module>
    one_process(dataset_folder, start_id, end_id)
  File "compute_volumetric_geodesic.py", line 130, in one_process
    pts_bone_visibility = calc_pts2bone_visible_mat(mesh_ori, origins, ends)
  File "compute_volumetric_geodesic.py", line 64, in calc_pts2bone_visible_mat
    locations, index_ray, index_tri = RayMeshIntersector.intersects_location(origins, ray_dir + 1e-15)
  File "C:\Anaconda3\envs\rigger\lib\site-packages\trimesh\ray\ray_triangle.py", line 107, in intersects_location
    **kwargs)
  File "C:\Anaconda3\envs\rigger\lib\site-packages\trimesh\ray\ray_triangle.py", line 66, in intersects_id
    triangles_normal=self.mesh.face_normals)
  File "C:\Anaconda3\envs\rigger\lib\site-packages\trimesh\ray\ray_triangle.py", line 244, in ray_triangle_id
    triangle_candidates = triangles[ray_candidates]
numpy.core._exceptions.MemoryError: Unable to allocate 76.2 GiB for an array with shape (1136630306, 3, 3) and data type float64
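The size of the failed allocation follows directly from the array shape in the error: 1,136,630,306 candidate triangles × 3 vertices × 3 coordinates × 8 bytes (float64). A quick check of the arithmetic:

```python
# Why trimesh's triangle-candidate array needs 76.2 GiB:
# shape (1136630306, 3, 3) of float64, i.e. 8 bytes per element.
n_candidates = 1_136_630_306
bytes_needed = n_candidates * 3 * 3 * 8
gib = bytes_needed / 2**30
print(f"{gib:.1f} GiB")  # → 76.2 GiB, matching the MemoryError
```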
@zhan-xu
Owner

zhan-xu commented Jul 1, 2021

Hi, could you try reducing the number of faces in the remeshed OBJ files? I modified this a few weeks ago in geometric_proc/compute_pretrain_attn.py. Take a look at Line 209-210:

if subsampling:
    mesh = mesh.simplify_quadric_decimation(3000)

You can do something similar here. In general, the crash happens because your remeshed OBJ file has too many faces. I will modify this part as well when I have more time.
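Besides decimating the mesh, the peak memory of the visibility query can also be capped by splitting the rays into fixed-size batches, so trimesh never builds the full candidate array at once. This is a sketch, not code from the repository; intersects_in_batches is a hypothetical helper assuming the intersector exposes trimesh's RayMeshIntersector.intersects_location signature:

```python
import numpy as np

def intersects_in_batches(intersector, origins, ray_dirs, batch_size=2000):
    """Run a ray/mesh intersection query in fixed-size batches to limit
    peak memory. `intersector` is assumed to behave like trimesh's
    RayMeshIntersector (intersects_location returns locations, ray
    indices relative to the batch, and triangle indices)."""
    locations, index_ray, index_tri = [], [], []
    for start in range(0, len(origins), batch_size):
        sl = slice(start, start + batch_size)
        loc, i_ray, i_tri = intersector.intersects_location(
            origins[sl], ray_dirs[sl])
        locations.append(loc)
        index_ray.append(i_ray + start)  # re-offset into the full ray array
        index_tri.append(i_tri)
    return (np.concatenate(locations),
            np.concatenate(index_ray),
            np.concatenate(index_tri))
```

In compute_volumetric_geodesic.py this would replace the single intersects_location call in calc_pts2bone_visible_mat; smaller batch_size values trade speed for a lower memory ceiling.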

@artursahak
Author

Yes, thank you. I manually decimated the models in MeshLab to 3000 faces.
In the meantime, after gen_dataset.py ran, I get an "invalid literal for int()" error for files in the train folder:

  File "C:\JavaTemp\RigNet\datasets\skeleton_dataset.py", line 107, in process
    name = int(v_filename.split('/')[-1].split('_')[0])
ValueError: invalid literal for int() with base 10: 'C:\\JavaTemp\\Dataset\\train\\1'

Would be great to hear your advice.

@artursahak
Author

In addition, the rigs are Mixamo rigs. Could that affect the result in any way and cause the error?

@zhan-xu
Owner

zhan-xu commented Jul 17, 2021

Instead of ".split('/')", you may need ".split('\\')", since Windows paths use backslashes; also check the value of "v_filename.split('/')[-1].split('_')[0]". You can modify the code to adapt to your path.
I haven't tried Mixamo data on RigNet before. I assume you might need to tune the thresholds for it.
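A separator-agnostic version of that parsing handles both Windows and POSIX paths. This is a sketch, not code from the repository; the helper name and the example filename (1_ori.obj) are hypothetical, since the dataset's exact naming scheme isn't shown in this thread:

```python
def extract_model_id(v_filename):
    """Parse the leading integer model id from a dataset filename,
    accepting both '/' and '\\' as path separators."""
    # Normalize Windows backslashes, then take the last path component
    base = v_filename.replace('\\', '/').split('/')[-1]
    return int(base.split('_')[0])

extract_model_id('C:\\JavaTemp\\Dataset\\train\\1_ori.obj')  # → 1
extract_model_id('/data/train/1_ori.obj')                    # → 1
```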

@zhan-xu zhan-xu closed this as completed Jul 20, 2021
@artursahak
Author

Thank you very much. All the files are organized and training runs, but there is another issue I am working hard on (opened a new issue :D).
