How to get h36m_single_train_openpose.npz? #20
This file includes the data for Human3.6M and is similar to the other .npz files we released. Besides image names, scale/center and 2D/3D/OpenPose joints, it also includes pose and shape parameters for the Human3.6M dataset. Due to the dataset license, we cannot release it, but if you have the data, it should be straightforward to produce it, if you also take a look at the other dataset preprocessing scripts we provide.
Could you please add this h36m script to this repository?
The script is already in SPIN/datasets/preprocess/h36m_train.py. I updated it to include the reading of the OpenPose predictions, but you will still need to read and include the SMPL pose and shape parameters in the .npz file (in case you have access to them).
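For readers trying to reproduce the file without access to the script's output, here is a minimal sketch of packing such an .npz with NumPy. The key names (`imgname`, `center`, `scale`, `part`, `S`, `openpose`, `pose`, `shape`) and array shapes are assumptions based on the other released SPIN preprocessing scripts, and the values below are toy placeholders:

```python
import numpy as np

# Toy number of examples; in practice one entry per training image.
n = 4
imgnames = np.array(['S1/img_%04d.jpg' % i for i in range(n)])  # hypothetical paths
centers  = np.random.rand(n, 2) * 1000   # bounding-box centers (pixels)
scales   = np.random.rand(n) * 2 + 1     # bounding-box scale factors
parts    = np.zeros((n, 24, 3))          # 2D joints + visibility flag
S        = np.zeros((n, 24, 4))          # 3D joints + validity flag
openpose = np.zeros((n, 25, 3))          # OpenPose keypoints + confidence
poses    = np.zeros((n, 72))             # SMPL pose (24 joints x 3 axis-angle)
shapes   = np.zeros((n, 10))             # SMPL shape coefficients (betas)

np.savez('h36m_single_train_openpose.npz',
         imgname=imgnames, center=centers, scale=scales,
         part=parts, S=S, openpose=openpose,
         pose=poses, shape=shapes)

data = np.load('h36m_single_train_openpose.npz')
print(sorted(data.files))
```

Loading one of the released .npz files and inspecting `data.files` is the easiest way to confirm the exact key set expected by the data loader.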
If I have the h36m dataset, how can I generate the pose and betas parameters?
We used pose and shape parameters generated by the MoSh method. However, due to license implications, this data is not public. You could try to fit SMPL to the 3D keypoints of the dataset. This would probably be less accurate, but it should give you reasonable results.
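To illustrate the structure of such a fit: for a fixed pose, SMPL's joints are approximately linear in the shape coefficients, so the shape part of the problem reduces to least squares. The sketch below uses a toy linear stand-in for the body model (random blend directions, not real SMPL basis vectors); a real fit is a nonlinear optimization over pose and shape jointly, e.g. in the style of SMPLify:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a body model: joint positions are a linear function of
# the shape coefficients (SMPL is roughly linear in betas at a fixed pose).
n_joints, n_betas = 24, 10
template = rng.normal(size=(n_joints * 3,))          # mean joint positions
blend    = rng.normal(size=(n_joints * 3, n_betas))  # shape blend directions

def joints(betas):
    return template + blend @ betas

# Simulated 3D keypoint annotations from a hidden shape, plus small noise.
true_betas = rng.normal(size=n_betas)
target = joints(true_betas) + 0.001 * rng.normal(size=n_joints * 3)

# Fit by linear least squares: argmin_b || template + blend @ b - target ||^2
fit, *_ = np.linalg.lstsq(blend, target - template, rcond=None)
print('max beta error:', np.abs(fit - true_betas).max())
```

With 72 keypoint coordinates constraining 10 coefficients, the recovery is well-conditioned; the hard part in practice is the nonlinear pose term and keypoint noise, not the shape solve.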
Do you generate pose and shape parameters of MPI-INF-3DHP in the same way? |
No MoSh results are available for MPI-INF-3DHP. In that case, we fit SMPL to keypoints. Since the ground truth is not very accurate, we prefer to use OpenPose predictions and fit SMPL to them for all viewpoints simultaneously.
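The benefit of fitting to all viewpoints at once is that each 2D detection constrains the same underlying 3D quantity. The sketch below shows that idea in its simplest form, linear triangulation of one 3D point from several cameras via the DLT; the actual pipeline instead optimizes full SMPL parameters against OpenPose 2D detections in every view jointly, but the multi-view constraint structure is the same (camera matrices here are toy `[I | t]` cameras, not real MPI-INF-3DHP calibration):

```python
import numpy as np

rng = np.random.default_rng(1)

# One hidden 3D point observed by several toy cameras of the form [I | t].
X = np.array([0.3, -0.2, 4.0])
cams = [np.hstack([np.eye(3), t.reshape(3, 1)])
        for t in rng.normal(size=(4, 3))]

# Stack two DLT constraints per view: u * P[2] - P[0] and v * P[2] - P[1].
rows = []
for P in cams:
    x, y, w = P @ np.append(X, 1.0)   # project into this view
    u, v = x / w, y / w               # observed 2D coordinates
    rows.append(u * P[2] - P[0])
    rows.append(v * P[2] - P[1])
A = np.array(rows)

# The homogeneous point (X, 1) spans the null space of A; recover it
# from the last right singular vector and dehomogenize.
_, _, Vt = np.linalg.svd(A)
X_hat = Vt[-1, :3] / Vt[-1, 3]
print('recovered point:', X_hat)
```

Each additional view adds two rows to `A`, which is why fitting all viewpoints simultaneously is more robust to per-view OpenPose noise than fitting each view separately.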
Sorry, I can't find any tutorial about how to use MoSh to get the parameters. Can you offer some links?
The code for the MoSh approach is not publicly available. You could refer to the description of the paper for more details.
@geopavlakos Hi, could I know how to fit SMPL to the 3D keypoints of the h36m dataset? I don't have the MoCap annotation, but I do have the 3D keypoint annotation. May I know which code to use to fit SMPL to the 3D keypoints?
@GuiZhaoyang Brother, please leave an email address; I'm willing to pay for the Human3.6M MoSh data.
@geopavlakos @BICHENG @liwenssss @vigorbird @GuiZhaoyang
Hi @geopavlakos et al., could you kindly release the OpenPose detections for these data? Thanks & best,
What data is included in h36m_single_train_openpose?
How can I generate h36m_single_train_openpose.npz?
Where can I find h36m_single_train_openpose.npz?
Any example?