
[Feature Enhancement] Motion retargeting from OpenPose JSON output #54

Open
aitikgupta opened this issue Jul 7, 2020 · 4 comments

@aitikgupta

First of all, thanks for all the work.
Currently we can use style transfer from JSON output, as described in README.md.

Can we use the JSON output from OpenPose for motion retargeting?
The pipeline would then be: video -> json_output_from_openpose -> (bvh?) -> any_of_the_given_mixamo_models
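For context, a minimal sketch (not from this repo) of what the first arrow would produce, assuming OpenPose is run with `--write_json`, the default BODY_25 model, one JSON file per frame, and a single person:

```python
import json
from pathlib import Path

import numpy as np


def load_openpose_sequence(json_dir):
    """Load per-frame OpenPose JSON files into (num_frames, 25, 3) = (x, y, confidence).

    Assumes the default BODY_25 model and one detected person per frame;
    frames with no detection are skipped.
    """
    frames = []
    for path in sorted(Path(json_dir).glob("*.json")):
        with open(path) as f:
            data = json.load(f)
        if not data.get("people"):
            continue  # no person detected in this frame
        kps = np.array(data["people"][0]["pose_keypoints_2d"], dtype=np.float32)
        frames.append(kps.reshape(-1, 3))  # flat [x0, y0, c0, x1, ...] -> (25, 3)
    return np.stack(frames)


# poses = load_openpose_sequence("output_json/")  # hypothetical output directory
```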

Referenced issues: #34 , #31

@aitikgupta aitikgupta changed the title Motion retargeting from OpenPose JSON output [Feature Enhancement] Motion retargeting from OpenPose JSON output Jul 7, 2020
@PeizhuoLi
Collaborator

Thank you for your interest. Unfortunately, converting joint positions (json_output_from_openpose) to joint rotations (bvh) is not a trivial task, and the latter representation is necessary for driving a skinned model.
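As a rough illustration of why (a hypothetical numpy sketch, not code from this repository): from positions alone one can only recover the "swing" of each joint, i.e. the rotation aligning the rest-pose bone direction with the observed direction, while the twist about the bone axis is left unconstrained:

```python
import numpy as np


def swing_rotation(rest_dir, target_dir, eps=1e-8):
    """Rotation matrix taking a rest-pose bone direction onto the observed,
    position-derived direction. Only the swing is recovered; the twist about
    the bone axis stays undetermined, so the conversion is under-constrained."""
    a = np.asarray(rest_dir, dtype=float)
    b = np.asarray(target_dir, dtype=float)
    a = a / (np.linalg.norm(a) + eps)
    b = b / (np.linalg.norm(b) + eps)
    axis = np.cross(a, b)           # unnormalised rotation axis, |axis| = sin(angle)
    s = np.linalg.norm(axis)
    c = np.dot(a, b)                # cos(angle)
    if s < eps:
        if c > 0.0:
            return np.eye(3)        # directions already aligned
        # anti-parallel: rotate 180 degrees about any axis perpendicular to a
        perp = np.cross(a, np.array([1.0, 0.0, 0.0]))
        if np.linalg.norm(perp) < eps:
            perp = np.cross(a, np.array([0.0, 1.0, 0.0]))
        perp = perp / np.linalg.norm(perp)
        return 2.0 * np.outer(perp, perp) - np.eye(3)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' formula written with the unnormalised axis: sin = s, cos = c
    return np.eye(3) + K + K @ K * ((1.0 - c) / (s * s))
```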

Although reconstructing 3D animation from monocular video is beyond the scope of this library, it is a popular research area with some nice existing works. You could try integrating another project and training your own model.

@aitikgupta
Author

Thanks for the definitive answer. I have one follow-up question, though it might be a very beginner-level one.

Can't we take the relative rotations / movements in an animation (one .bvh containing one type of armature doing something) and impose them onto a different armature (one .fbx containing a skinned model with its own armature but no animation), without using a trained model?

@PeizhuoLi
Collaborator

I'm not quite sure what relative rotation means here, but I think that's exactly what motion retargeting does: transfer an animation from one armature to a different one. And of course, there are related works that do not use deep learning and can possibly deal with unseen armatures.

@aitikgupta
Author

By relative rotation I meant, for example, relative rotations and movements in parent-child structures: effectively we'd have a core bone, and every other bone's rotation relative to its parent, to represent the whole structure. This would help because it would standardize over almost any armature type. (Just a vague beginner idea, I could be completely wrong here.)
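For concreteness, a small hypothetical sketch (plain numpy, not this repo's code) of composing such parent-relative rotations into world-space joint positions; this is essentially the hierarchy a BVH file stores:

```python
import numpy as np


def forward_kinematics(local_rotations, offsets, parents, root_position):
    """Compose per-joint rotations expressed relative to each joint's parent
    (the 'relative rotations' above) into world-space joint positions.

    local_rotations : (J, 3, 3) rotation matrices, each joint relative to its parent
    offsets         : (J, 3) rest-pose offset of each joint from its parent
    parents         : list of parent indices, -1 for the root ('core bone')
    root_position   : (3,) world position of the root
    """
    num_joints = len(parents)
    world_rot = [None] * num_joints
    world_pos = np.zeros((num_joints, 3))
    for j in range(num_joints):
        p = parents[j]
        if p == -1:
            world_rot[j] = local_rotations[j]
            world_pos[j] = np.asarray(root_position, dtype=float)
        else:
            world_rot[j] = world_rot[p] @ local_rotations[j]
            world_pos[j] = world_pos[p] + world_rot[p] @ offsets[j]
    return world_pos
```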

Apart from that, would you be kind enough to point me to some other existing related works? I tried searching but couldn't find much.
Thanks anyway. 👍
