Pre-processed data #1
Comments
Actually, it is the same as in the original repo. It is the normalized data. Best.
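For context, "normalized" in pipelines of this kind usually means zero-mean, unit-variance per coordinate, with statistics computed on the training set only. Below is a minimal sketch of that convention; the helper names (`normalize_stats`, `normalize`) are illustrative, not the repo's actual API.

```python
import numpy as np

def normalize_stats(train_data):
    # Per-dimension mean and std over the training set only,
    # so no test-set statistics leak into preprocessing.
    mean = train_data.mean(axis=0)
    std = train_data.std(axis=0)
    return mean, std

def normalize(data, mean, std, eps=1e-8):
    # Zero-mean, unit-variance per coordinate; eps guards
    # against division by zero for constant dimensions.
    return (data - mean) / (std + eps)

# Toy example: 4 "poses" with 6 coordinates each.
train = np.random.randn(4, 6) * 10 + 3
mean, std = normalize_stats(train)
norm = normalize(train, mean, std)
print(norm.mean(axis=0))  # each dimension close to 0
```

The same `mean`/`std` would then be applied to the test split, and un-normalization reverses the transform before computing metrics in millimeters.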
So it's the same as running their data loading and then their preprocessing code (i.e. including projection, keypoint exclusion, etc.)? Thanks
Yes.
Thanks for all your help. When I load the data with your code, it is the following size: From the original TensorFlow repo, the train set is the same size but the test set is bigger: Any ideas why it might be different?
Sorry, I have fixed this and updated the data. The videos provided by Human3.6M contain one damaged video, so the test set is smaller when using Stacked Hourglass to predict 2D poses, and I mistakenly missed this one when processing the ground-truth data. I will upload the data-processing code.
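A mismatch like this is easy to localize by diffing per-sequence frame counts between the two preprocessed datasets. The sketch below is a hypothetical debugging helper (the dict-of-arrays layout keyed by `(subject, action)` is an assumption, not the repo's actual format):

```python
import numpy as np

def compare_splits(ours, theirs):
    # Report every (subject, action) key whose frame count differs
    # between two preprocessed datasets; a missing or damaged video
    # shows up as one key with mismatched counts.
    diffs = {}
    for key in sorted(set(ours) | set(theirs)):
        n_ours = len(ours.get(key, ()))
        n_theirs = len(theirs.get(key, ()))
        if n_ours != n_theirs:
            diffs[key] = (n_ours, n_theirs)
    return diffs

# Toy example: one sequence has fewer frames on one side.
ours = {(9, "Walking"): np.zeros((100, 48)),
        (11, "Eating"): np.zeros((80, 48))}
theirs = {(9, "Walking"): np.zeros((100, 48)),
          (11, "Eating"): np.zeros((90, 48))}
print(compare_splits(ours, theirs))  # {(11, 'Eating'): (80, 90)}
```

Running this on the ground-truth and Stacked-Hourglass-derived test sets would point directly at the sequence affected by the damaged video.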
Thanks |
@weigq, @macaodha Btw, as supplementary information: this action has no video in the Human3.6M dataset, and the actions below have fewer annotations than expected in global coordinates (not in the image plane). Good work!
@salihkaragoz |
I will close this issue; you can reopen it if needed.
Hi there,
Your code looks great. I was just wondering what the main difference is between your pre-processed dataset and
h36m.zip
from the original TensorFlow repo. Thanks!