getting manual alignment coordinates #5
Hi @yashgarg98, I have used the steps described here to align the human and the scene. You can then update the transformation parameters defined here: https://github.com/apple/ml-hugs/blob/main/hugs/datasets/neuman.py#L89
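For intuition, the transformation parameters mentioned above (translation, rotation, scale) amount to a similarity transform applied to the human's points before rendering them in scene coordinates. The sketch below is illustrative only: it assumes an XYZ Euler-angle convention composed as Rz·Ry·Rx and the order p' = s·R·p + t; the exact convention used in neuman.py may differ, so check against the repo.

```python
import numpy as np

def euler_to_rotmat(rx, ry, rz):
    # Rotations about x, y, z (radians), composed as R = Rz @ Ry @ Rx.
    # NOTE: this convention is an assumption; verify the one your
    # pipeline actually uses before relying on it.
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def align_human_to_scene(points, translation, rotation_euler, scale):
    # points: (N, 3) human vertices in the human's local frame.
    # Returns points in scene coordinates: p' = s * R @ p + t.
    R = euler_to_rotmat(*rotation_euler)
    return scale * (points @ R.T) + np.asarray(translation)

# Example: scale by 2, no rotation, then shift one unit along x.
pts = np.array([[1.0, 0.0, 0.0]])
out = align_human_to_scene(pts, translation=[1.0, 0.0, 0.0],
                           rotation_euler=[0.0, 0.0, 0.0], scale=2.0)
# out → [[3.0, 0.0, 0.0]]
```

Tweaking the translation moves the human around the scene, the Euler angles reorient it, and the scale matches the human's size to the scene's metric units.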
Got it, thanks. I have another question: when I add humans into my own trained background scene, the colors/textures on the humans look slightly transparent, more like a blend of their clothing colors with the background pixel colors. Is there a hyperparameter or some other way to control or resolve this? I can upload some sample images for reference if needed.
Could you share the sample images?
Hi, thanks for open-sourcing this code.
I have been trying to generate custom videos combining different scenes with different humans. I'm having trouble setting the manual alignment coordinates (translation, rotation, and scale) so that the human is rendered at my desired position in the scene. I followed the method explained in the NeuMan paper: I exported point clouds (.ply) of the scene and the human into Blender and placed the human at my desired location. However, this method doesn't generate animated videos with the human at the location I want.
I'm stuck here. Can you explain the correct way to obtain the manual alignment coordinates?
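One generic way to turn a manual Blender placement into alignment parameters is to fit a least-squares similarity transform (the Umeyama/Procrustes method) between the human's original vertices and the same vertices after you moved them in Blender. This is a sketch under the assumption that both point clouds keep the same vertex order; it is not tooling from the ml-hugs repo itself.

```python
import numpy as np

def umeyama(src, dst):
    # Least-squares similarity transform with dst ≈ s * R @ src + t.
    # src, dst: (N, 3) corresponding points, e.g. human vertices
    # before and after manual placement in Blender (same order).
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflection
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt                            # best-fit rotation
    var_s = (src_c ** 2).sum() / len(src)     # source variance
    s = np.trace(np.diag(S) @ D) / var_s      # best-fit scale
    t = mu_d - s * R @ mu_s                   # best-fit translation
    return s, R, t
```

The recovered rotation matrix can then be converted to whatever parameterization (e.g. Euler angles) the dataset code expects. Because the fit is least-squares over all vertices, it is robust to small export/import precision differences between Blender and your pipeline.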