Is the ground truth MoSh data of H3.6M incorrect? #50
Comments
Sorry, I didn't notice the rotate_base option in def batch_global_rigid_transformation.
@zycliao But their released model file seems to have been trained with the rotate_base flag OFF, and obviously the ground truth can only be used when the flag is ON. So I wonder whether they really used the H3.6M SMPL parameter ground truth or not. Could you share more info if you know how they trained the model?
@akanazawa Did you set the rotate_base flag on when you trained the model with all the datasets? But your trained model only predicts 3D joints with the correct orientation when rotate_base is False. Thanks so much for resolving my puzzle.
Hi, sorry for the confusion. The global rotation of MoSh is not aligned to the image; it's in whatever coordinate frame the raw mocap markers were in (not synchronized with the image coordinate frame). It could be some simple transformation, but you'd have to use the camera to adjust for each viewpoint. The rotate_base flag should be off in all experiments; it's an artifact from something else I did a long time ago :p Hope this clears things up. Best, Angjoo
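To illustrate the kind of per-viewpoint adjustment mentioned above — purely a sketch, not code from the repo: `R_cam` stands for a hypothetical world-to-camera rotation you would read from the H3.6M camera calibration, and `rodrigues` is a plain-NumPy stand-in for cv2.Rodrigues so the snippet runs without OpenCV:

```python
import numpy as np

def rodrigues(rvec):
    """Axis-angle vector (3,) -> 3x3 rotation matrix (plain-NumPy Rodrigues formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-8:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def align_root_to_camera(root_rvec, R_cam):
    """Rotate a world-frame SMPL root orientation into a camera frame.

    root_rvec: (3,) axis-angle global rotation from the MoSh fit (world frame).
    R_cam:     (3, 3) hypothetical world-to-camera rotation from calibration.
    Returns the 3x3 root rotation expressed in the camera frame.
    """
    return R_cam @ rodrigues(root_rvec)
```

With the identity as `R_cam` this is a no-op, which makes the composition easy to sanity-check; for a real viewpoint you would plug in that camera's extrinsic rotation.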
Sorry! I thought you were referring to the moshed files that I use for the prior; for those, I don't think the global rotation is aligned to the image. But I see now that you're talking about the H36M tfrecords. For these, I computed the correct global rotation using the ground truth 3D joints, so they should align. But there is something wrong in the data: as you noticed, it is upside down. You can use the function below to correct the pose. Sorry, this is very confusing, but this was actually the data I trained my models on. After all the experiments and making the code public, I realized that the gt3d is upside down (due to the way we were processing the data previously, hence that rotate_base flag, etc.). I retried training with this corrected ground truth pose, but there was only a trivial change in performance, so I haven't updated the dataset yet. I should. I'd welcome any help documenting this and adding it to the README. Anyway, sorry for your troubles. Here's the code to rectify it:

```python
import cv2
import numpy as np


def rectify_pose(pose):
    """
    Rectify "upside down" people in global coord.

    Args:
        pose (72,): SMPL pose parameters in axis-angle form;
            pose[:3] is the global root rotation.
    Returns:
        Pose with the root rotation flipped 180 degrees about the x-axis.
    """
    pose = pose.copy()
    R_mod = cv2.Rodrigues(np.array([np.pi, 0, 0]))[0]
    R_root = cv2.Rodrigues(pose[:3])[0]
    new_root = R_root.dot(R_mod)
    pose[:3] = cv2.Rodrigues(new_root)[0].reshape(3)
    return pose
```

Thank you very much! I am impressed by your quick reply! I will see what I can do to the README file. Wish you a good day!
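A quick sanity check on rectify_pose above: the fix is a 180° flip of the root about the x-axis, so applying it twice should return the original pose (since R_mod · R_mod = I). Below is a sketch using a plain-NumPy Rodrigues pair in place of cv2.Rodrigues so it runs without OpenCV; `rectify_pose_np` is my renamed equivalent, not part of the repo:

```python
import numpy as np

def rodrigues(rvec):
    """Axis-angle (3,) -> 3x3 rotation matrix (stand-in for cv2.Rodrigues)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-8:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def inv_rodrigues(R):
    """3x3 rotation matrix -> axis-angle (3,); assumes rotation angle < pi."""
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-8:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return theta * axis

def rectify_pose_np(pose):
    """Same flip as rectify_pose: right-multiply the root by a 180-degree x rotation."""
    pose = pose.copy()
    R_mod = rodrigues(np.array([np.pi, 0.0, 0.0]))
    pose[:3] = inv_rodrigues(rodrigues(pose[:3]).dot(R_mod))
    return pose
```

Running it twice on a dummy 72-vector and comparing against the input is a cheap way to confirm the flip is an involution.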
Np :)! Thanks for your interest!!
Hi @akanazawa, I have access to the Human3.6M moshed dataset. What is the different processing between the *.pkl and the *_camx_aligned.pkl files, or could you tell me where I can find the processing scripts? I want to get the continuous SMPL mesh.
I'm not exactly sure, but probably the aligned one is in the camera coordinate space of the said camera. You probably just want to use the *.pkl. There should be a dict with something like
@akanazawa Thanks for your quick reply! I checked the moshed data: the *_camx_aligned.pkl was downsampled by 5, and the *.pkl is continuous. The only difference between *_camx_aligned.pkl and *.pkl is the root orientation. I want to do the alignment from the *.pkl data to *_camx_aligned.pkl, but I'm not sure how to do the transformation. Could you please give me some advice?
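One way to recover that transformation empirically, assuming the only difference really is the root orientation: take a frame present in both files (if the aligned file is downsampled by 5, frame 5·i of *.pkl should match frame i of *_camx_aligned.pkl — an assumption worth verifying), solve for the relative rotation, and apply it to every frame. A plain-NumPy sketch with hypothetical helper names (`relative_root_rotation`, `apply_to_sequence` are mine, not from the dataset tools):

```python
import numpy as np

def rodrigues(rvec):
    """Axis-angle (3,) -> 3x3 rotation matrix (plain-NumPy Rodrigues)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-8:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def inv_rodrigues(R):
    """3x3 rotation matrix -> axis-angle (3,); assumes rotation angle < pi."""
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-8:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return theta * axis

def relative_root_rotation(root_world, root_aligned):
    """Solve R_rel such that R_rel @ R_world == R_aligned for one matched frame."""
    return rodrigues(root_aligned) @ rodrigues(root_world).T

def apply_to_sequence(roots_world, R_rel):
    """Left-multiply every frame's (3,) root axis-angle by the recovered rotation."""
    return np.stack([inv_rodrigues(R_rel @ rodrigues(r)) for r in roots_world])
```

If the two files really differ only by a fixed camera rotation, `R_rel` recovered from any matched frame should be (nearly) constant across frames, which is itself a useful consistency check.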
I tried the following code, and the visualization results show that the global rotation of the ground truth MoSh data doesn't correspond to the image. Did I get something wrong, or is the data incorrect?