Custom regressor for additional keypoints #148
Comments
Hi, looks like a neat tool!! Sorry, I forget most of the details and don't have access to the code right now. I think we used some kind of sparse linear regressor from scipy or something like that. If you're talking about the per-joint weights being normalized, I think it makes sense for them to sum to 1 and therefore fall in that range! Best!
Hi @akanazawa …
Hi @longbowzhang, you can check my repo. I have checked the original SMPL 2015 paper again and found that they were using non-negative least squares. Originally the mesh was hand-segmented into 24 parts, and then this was optimized down to a sparse set of vertices with associated weights influencing each joint.
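The non-negative least squares fit mentioned above can be sketched roughly as follows. This is a toy illustration, not the actual SMPL training code: the sizes, the single-coordinate setup, and the hand-picked "true" weights are all made up for the example.

```python
# Hypothetical sketch: fit non-negative vertex weights that regress a
# joint location from mesh vertices, in the spirit of the SMPL regressor.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

n_verts, n_meshes = 50, 200                 # toy sizes, not SMPL's 6890 vertices
V = rng.normal(size=(n_meshes, n_verts))    # one coordinate of each vertex, per mesh

# Made-up sparse, non-negative ground-truth weights that sum to 1
w_true = np.zeros(n_verts)
w_true[[3, 10, 17, 25, 40]] = [0.4, 0.25, 0.15, 0.12, 0.08]
j = V @ w_true                              # joint coordinate per mesh

w, _ = nnls(V, j)                           # non-negative least squares fit
w /= w.sum()                                # normalize so weights sum to 1
```

Because the true weights here are already non-negative and the system is overdetermined and noiseless, NNLS recovers them; in practice one would also add a sparsity or locality constraint so only a few vertices influence each joint.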
Hi @russoale, thanks very much for your reply and your repo.
Good point. I'm currently using scipy's …
Hi @russoale, |
Which layer are you referring to? As far as I have checked, the code should be consistent with the paper.
Hi @akanazawa,
I'm working on a tool that tries to eliminate learning a new regressor: instead, you just click certain vertices and thereby define a new keypoint, i.e.
![Screenshot 2020-07-03 at 17 11 54](https://user-images.githubusercontent.com/15835919/86481494-69e6fb00-bd50-11ea-8d34-8791d3f02ab2.png)
Then, given the newly defined joint, I select the N closest vertices and solve the linear matrix equation for the weights via a least-squares solution. Could you explain why the original cocoplus regressor is normalized 0 to 1 while the vertices are not?
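The clicked-keypoint approach described above might look roughly like this. Everything here is illustrative: the toy mesh, the clicked location, and the choice of N are assumptions, and plain `numpy.linalg.lstsq` stands in for whatever solver the tool actually uses.

```python
# Hypothetical sketch: pick the N mesh vertices nearest a clicked
# keypoint and solve for regressor weights with ordinary least squares.
import numpy as np

rng = np.random.default_rng(1)
verts = rng.normal(size=(100, 3))        # toy mesh: 100 vertices in 3D
keypoint = np.array([0.1, 0.2, 0.0])     # the clicked target location

N = 8                                    # number of nearest vertices to use
dists = np.linalg.norm(verts - keypoint, axis=1)
nearest = np.argsort(dists)[:N]          # indices of the N closest vertices

# Solve verts[nearest].T @ w ~= keypoint for w (3 equations, N unknowns);
# lstsq returns the minimum-norm solution of this underdetermined system.
w, *_ = np.linalg.lstsq(verts[nearest].T, keypoint, rcond=None)

reconstructed = verts[nearest].T @ w     # should land back on the keypoint
```

Note that unlike the NNLS weights in the SMPL regressor, nothing here forces the weights to be non-negative or to sum to 1, which relates to the normalization question above.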