How to get h36m_single_train_openpose.npz? #20

Closed
BICHENG opened this issue Oct 29, 2019 · 13 comments
@BICHENG

BICHENG commented Oct 29, 2019

What data is included in h36m_single_train_openpose.npz?
How can I generate h36m_single_train_openpose.npz?
Where can I find h36m_single_train_openpose.npz?

Any example?

@geopavlakos
Collaborator

This file includes the data for Human3.6M and is similar to the other .npz files we released. Besides image names, scale/center and 2D/3D/openpose joints, it also includes pose and shape parameters for the Human3.6M dataset. Due to the dataset license, we cannot release it, but if you have the data, it should be straightforward to produce, especially if you also take a look at the other dataset preprocessing scripts we provide.
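As a rough illustration of what such a file might look like (key names and array shapes here are assumptions inferred from the fields listed above, not the exact SPIN schema — check the repo's preprocessing scripts for the authoritative layout):

```python
import numpy as np

# Illustrative sketch only: keys and shapes are assumptions based on the
# fields mentioned in the comment above (image names, center/scale,
# 2D/3D/openpose joints, SMPL pose and shape parameters).
n = 4  # number of example frames

np.savez(
    "h36m_single_train_openpose.npz",
    imgname=np.array(["S1/img_%06d.jpg" % i for i in range(n)]),
    center=np.zeros((n, 2)),        # bounding-box center per image
    scale=np.ones(n),               # bounding-box scale per image
    part=np.zeros((n, 24, 3)),      # 2D joints with a confidence column
    S=np.zeros((n, 24, 4)),         # 3D joints with a visibility flag
    openpose=np.zeros((n, 25, 3)),  # OpenPose 2D detections
    pose=np.zeros((n, 72)),         # SMPL pose parameters (axis-angle)
    shape=np.zeros((n, 10)),        # SMPL shape parameters (betas)
)

data = np.load("h36m_single_train_openpose.npz")
print(sorted(data.files))
```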

@BICHENG
Author

BICHENG commented Oct 30, 2019

Could you please add this h36m script to this repository?

@geopavlakos
Collaborator

The script is already in SPIN/datasets/preprocess/h36m_train.py. I updated it to include the reading of the openpose predictions, but you will still need to read and include the SMPL pose and shape parameters in the .npz file (in case you have access to them).
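The extra step described here — taking the .npz produced by the preprocessing script and folding in the SMPL parameters — could look roughly like this (a hedged sketch; the key names `pose` and `shape` and the array shapes are assumptions, and the dummy arrays stand in for the non-public MoSh output):

```python
import numpy as np

def add_smpl_params(npz_in, npz_out, pose, shape):
    """Re-save an annotation .npz with SMPL parameters added.

    Assumed conventions: pose is (N, 72) axis-angle per frame,
    shape is (N, 10) betas per frame.
    """
    data = dict(np.load(npz_in, allow_pickle=True))
    data["pose"] = pose
    data["shape"] = shape
    np.savez(npz_out, **data)

# Example with dummy arrays in place of the real MoSh parameters:
n = 3
np.savez("h36m_train.npz",
         imgname=np.array(["a.jpg", "b.jpg", "c.jpg"]),
         center=np.zeros((n, 2)), scale=np.ones(n))
add_smpl_params("h36m_train.npz", "h36m_train_smpl.npz",
                np.zeros((n, 72)), np.zeros((n, 10)))
print(sorted(np.load("h36m_train_smpl.npz").files))
```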

@liwenssss

If I have the h36m dataset, how can I generate the pose and betas parameters?

@geopavlakos
Collaborator

We used pose and shape parameters generated by the MoSh method. However, due to license implications, this data is not public. You could try to fit SMPL to the 3D keypoints of the dataset. This would probably be less accurate, but it should give you reasonable results.
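The fitting idea can be sketched as a regularized least-squares problem. The real pipeline would run an SMPL layer inside an iterative optimizer (in the spirit of SMPLify, but against 3D joints); the toy linear joint model below is purely a stand-in to show the pattern, not the actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for SMPL: a linear map from 82 parameters (72 pose +
# 10 shape) to 24 flattened 3D joints. A real fit would replace J @ p
# with an SMPL forward pass and use an iterative solver.
J = rng.standard_normal((24 * 3, 82)) * 0.1

true_params = rng.standard_normal(82)
target = J @ true_params  # "observed" 3D keypoints, flattened

# Regularized least squares: argmin ||J p - target||^2 + lam ||p||^2
# The small regularizer plays the role of a pose/shape prior.
lam = 1e-6
p = np.linalg.solve(J.T @ J + lam * np.eye(82), J.T @ target)
print("max joint error:", float(np.max(np.abs(J @ p - target))))
```

As the comment above notes, fitting to 3D keypoints alone tends to be less accurate than MoSh, which uses the full mocap marker data.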

@liwenssss

Do you generate pose and shape parameters of MPI-INF-3DHP in the same way?

@geopavlakos
Collaborator

No MoSh results are available for MPI-INF-3DHP. In that case, we fit SMPL to keypoints. Since ground truth is not very accurate, we prefer to use OpenPose predictions and fit SMPL to them for all viewpoints simultaneously.
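The "all viewpoints simultaneously" idea amounts to one shared set of 3D joints being scored against the 2D detections from every camera at once, weighted by detection confidence. A hedged sketch of such an objective (bare pinhole cameras; in the real fitting the 3D joints would come from SMPL and the objective would be minimized over its parameters):

```python
import numpy as np

def project(X, K, R, t):
    """Project world-space 3D points X (N, 3) through a pinhole camera."""
    x_cam = X @ R.T + t                 # world -> camera coordinates
    x_img = x_cam[:, :2] / x_cam[:, 2:3]
    return x_img @ K[:2, :2].T + K[:2, 2]

def multiview_loss(X, cameras, detections):
    """Confidence-weighted reprojection error summed over all views.

    detections[v] is (J, 3): x, y, OpenPose confidence for view v.
    """
    total = 0.0
    for (K, R, t), det in zip(cameras, detections):
        proj = project(X, K, R, t)
        conf = det[:, 2:3]
        total += float(np.sum(conf * (proj - det[:, :2]) ** 2))
    return total

# Tiny example: one joint observed by two cameras in front of the origin.
K = np.array([[1000.0, 0, 500], [0, 1000.0, 500], [0, 0, 1]])
cams = [(K, np.eye(3), np.array([0.0, 0.0, 5.0])),
        (K, np.eye(3), np.array([0.1, 0.0, 5.0]))]
X = np.array([[0.0, 0.0, 0.0]])
dets = [np.hstack([project(X, *c), np.ones((1, 1))]) for c in cams]
print("loss at ground truth:", multiview_loss(X, cams, dets))
```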

@liwenssss

liwenssss commented Dec 6, 2019 via email

@geopavlakos
Collaborator

The code for the MoSh approach is not publicly available. You could refer to the description of the paper for more details.

@GuiZhaoyang

@geopavlakos Hi, could I know how to fit SMPL to the 3D keypoints of the h36m dataset? I don't have the mocap annotations, but I do have the 3D keypoint annotations. Could you point me to code for fitting SMPL to the 3D keypoints?

@vigorbird

@GuiZhaoyang Friend, could you leave an email address? I'm willing to pay for the Human3.6M MoSh data.

@EveningLin

@geopavlakos @BICHENG @liwenssss @vigorbird @GuiZhaoyang
Where can I get the JSON file referenced in this line of the code?
json_file = os.path.join(openpose_path, 'coco', imgname.replace('.jpg', '_keypoints.json'))
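For context, these JSON files are OpenPose's own per-image output, produced by running OpenPose on the dataset images with its `--write_json` option. A minimal sketch of reading one such file (assuming the BODY_25 format, 25 joints × [x, y, confidence], and taking the first detected person):

```python
import json
import numpy as np

def read_openpose(json_file):
    """Return (25, 3) OpenPose BODY_25 keypoints, or zeros if no person."""
    with open(json_file) as f:
        data = json.load(f)
    if not data["people"]:
        return np.zeros((25, 3))  # no detection for this image
    kp = data["people"][0]["pose_keypoints_2d"]
    return np.array(kp).reshape(25, 3)

# Dummy file standing in for real OpenPose output, just to show the shape:
with open("img_000001_keypoints.json", "w") as f:
    json.dump({"people": [{"pose_keypoints_2d": [0.0] * 75}]}, f)
print(read_openpose("img_000001_keypoints.json").shape)
```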

@GloryyrolG

No MoSh results are available for MPI-INF-3DHP. In that case, we fit SMPL to keypoints. Since ground truth is not very accurate, we prefer to use OpenPose predictions and fit SMPL to them for all viewpoints simultaneously.

Hi @geopavlakos et al., could you kindly release the OpenPose detections for these data? Thanks & best,
