Memory crash from compute_volumetric_geodesic.py #50
Comments
Hi, could you try reducing the number of faces in the remeshed OBJ files? I modified this a few weeks ago in geometric_proc/compute_pretrain_attn.py; take a look at Lines 209-210.
You can do something similar here. In general, this happens because your remeshed OBJ file has too many faces. I will also modify this part when I have more time.
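The allocation that fails is proportional to the number of (surface point, face) pairs, each stored as a 3x3 block of float64. A back-of-the-envelope helper like the one below (my own illustrative sketch, not code from the repository) can tell you how aggressively to decimate before running the script:

```python
def pairwise_gib(num_pts, num_faces):
    """GiB needed for a float64 array of shape (num_pts * num_faces, 3, 3)."""
    return num_pts * num_faces * 3 * 3 * 8 / 2**30

# The reported crash corresponds to shape (1136630306, 3, 3):
print(round(pairwise_gib(1136630306, 1), 1))  # → 76.2
```

With a mesh decimated to a few thousand faces, the same product drops to well under a gigabyte.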
Yes, thank you. I manually decimated the models with MeshLab so each contains 3,000 faces. It would be great to hear your advice.
In addition, the rigs are Mixamo rigs. Could that affect the result in any way and cause the error?
Instead of ".split('/')", you may need ".split('\\')" on Windows. Also check the value of "v_filename.split('/')[-1].split('_')[0]". You can modify the code to suit your paths.
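A separator-agnostic alternative is to parse the path with pathlib instead of hard-coding '/' or '\\'. A minimal sketch (the filename below is hypothetical):

```python
from pathlib import PureWindowsPath

# Hypothetical Windows-style path; PureWindowsPath accepts both '\' and '/'
# as separators, so the same code also handles POSIX-style paths.
v_filename = r"data\obj_remesh\1234_remesh.obj"

name = PureWindowsPath(v_filename).stem   # filename without directory or extension
model_id = name.split('_')[0]
print(model_id)  # → 1234
```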
Thank you very much. All the files are now organized and training runs, but there is another issue I am working hard on (I opened a separate issue :D).
Dear zhan-xu, I tried to run compute_volumetric_geodesic.py on my own dataset (structured much like your preprocessed folder), but here is my issue: Unable to allocate 76.2 GiB for an array with shape (1136630306, 3, 3)
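Besides decimating the mesh, the allocation itself can be avoided by processing the point-face pairs in chunks rather than materializing one giant array. The sketch below is my own simplified illustration (it measures distance to triangle *vertices* as a stand-in for the real point-triangle visibility test); only one chunk's worth of the (points x faces) grid exists in memory at a time:

```python
import numpy as np

def min_dist_to_triangles_chunked(pts, tri_verts, chunk=1024):
    """pts: (P, 3) points; tri_verts: (F, 3, 3) triangle vertices.
    Returns (P,) minimum distance from each point to any triangle vertex,
    computed chunk-by-chunk to bound peak memory."""
    out = np.empty(len(pts))
    for s in range(0, len(pts), chunk):
        block = pts[s:s + chunk]                                   # (c, 3)
        # Temporary array is (c, F, 3, 3) for this chunk only, never (P*F, 3, 3)
        d = np.linalg.norm(block[:, None, None, :] - tri_verts[None], axis=-1)
        out[s:s + chunk] = d.reshape(len(block), -1).min(axis=1)
    return out
```

The chunk size trades memory for speed: larger chunks vectorize more work per iteration but allocate proportionally bigger temporaries.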