
Custom regressor for additional keypoints #148

Closed
russoale opened this issue Jul 5, 2020 · 7 comments


russoale commented Jul 5, 2020

Hi @akanazawa,

I'm working on a tool that avoids learning a new regressor: instead, you define a new keypoint by just clicking certain vertices, i.e.
[Screenshot: tool UI for selecting vertices, 2020-07-03 17:11:54]

Then, for the newly defined joint, I select the N closest vertices and solve the linear matrix equation using a least-squares solution.
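That nearest-vertices least-squares step could be sketched roughly like this (a minimal NumPy sketch; the function name, signature, and N=30 default are mine, not from the repo):

```python
import numpy as np

def regressor_weights(vertices, target, n_closest=30):
    """Hypothetical helper: fit sparse weights w with vertices.T @ w ~= target,
    using only the n_closest mesh vertices to the clicked keypoint."""
    # vertices: (V, 3) mesh vertices; target: (3,) clicked keypoint position.
    dists = np.linalg.norm(vertices - target, axis=1)
    idx = np.argsort(dists)[:n_closest]
    # Solve the underdetermined 3 x n_closest system in the least-squares sense
    # (lstsq returns the minimum-norm exact solution here).
    w_local, *_ = np.linalg.lstsq(vertices[idx].T, target, rcond=None)
    # Scatter back into a sparse full-length weight vector.
    w = np.zeros(len(vertices))
    w[idx] = w_local
    return w
```

Note this plain least-squares variant does not constrain the weights to be non-negative or to sum to one, which is part of what the rest of this thread discusses.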

Could you explain why the original cocoplus regressor is normalized to the range 0 to 1 while the vertices are not?

@akanazawa (Owner)

Hi, looks like a neat tool!!

Sorry, I forget most of the details and don't have access to the code right now. I think we used some kind of sparse linear regressor from scipy or something like that. If you're talking about the weights per joint being normalized, it makes sense for them to sum to 1 and therefore lie in that range! Best!

@longbowzhang

Hi @akanazawa
Sorry to bother you again.
I am also curious about how the regressors (e.g. cocoplus_regressor or J_regressor_h36m) were obtained.
Could you share the code or describe the related process?
Thanks a lot in advance.

@russoale (Author)

Hi @longbowzhang,

you can check my repo. I checked the original SMPL 2015 paper again and found that they were using non-negative least squares. Originally the mesh was hand-segmented into 24 parts and then optimized down to a sparse set of vertices and associated weights influencing each joint.
I have just released my interpretation of the tool, so feel free to have a look.

@longbowzhang

Hi @russoale, thanks very much for your reply and your repo.
Just a minor question: as far as I know, the associated weights for each joint sum to one. Can non-negative least squares guarantee that?

@russoale (Author)

Good point. I'm currently using scipy's nnls implementation, which doesn't allow adding constraints.
But you can append another equation to the system of linear equations requiring the weights to sum up to 1.
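As a rough sketch of that trick: append a heavily weighted row of ones to the system before calling `scipy.optimize.nnls`, so the least-squares objective strongly prefers `sum(w) == 1` (the helper name and penalty value below are illustrative, not from the repo):

```python
import numpy as np
from scipy.optimize import nnls

def nnls_sum_to_one(A, b, penalty=1e3):
    """Hypothetical helper: non-negative least squares with a soft
    sum(w) == 1 constraint via an extra heavily weighted equation."""
    # Augmented system: [A; penalty * 1^T] w ~= [b; penalty * 1].
    A_aug = np.vstack([A, penalty * np.ones(A.shape[1])])
    b_aug = np.append(b, penalty)
    w, _residual = nnls(A_aug, b_aug)
    return w
```

This only enforces the constraint softly; the larger the penalty, the closer the weights come to summing exactly to one, at the cost of a slightly worse fit on the original equations.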

@longbowzhang

Hi @russoale,
Sorry to bother you again. I have a question w.r.t. the implementation of the Discriminator.
In the function Discriminator_separable_rotations, I found that there is no activation_fn (e.g. relu) used.
I think this conflicts with the paper, right?


russoale commented Aug 7, 2020

Which layer are you referring to? As far as I have checked, the code is consistent with the paper.
Discriminator_separable_rotations uses TensorFlow contrib's slim package; check the layers' default parameters and also take a look at the arg_scope.
