Question on fine-tuning for face pose evaluation #59
The code to fine-tune on 300W-LP is the same as here; we just didn't release the annotations. If you prefer, the fine-tuned pre-trained model can be found here.
Sorry, the link in the first sentence seems to be missing; I cannot see it.
Edited.
So, would you release the annotations? As far as I can see, the network's performance without fine-tuning is very poor. Performance comparison on AFLW2000:
We have no plans to release these annotations at this time. The performance only appears poor because of a problem with Euler angles. For anything other than evaluating in Euler angles, we recommend using the model without fine-tuning.
Thank you so much. ❤️
So what if we use 'xyz' and 'zxy' to decode the output at the same time, and choose the better one as the final result? (I mean, for each face.)
We could do that, but to be fair to the other models we compare against, we only use 'xyz'.
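The per-face selection proposed above, decoding the predicted rotation under both Euler orders and keeping whichever lands closer to the ground truth, could be sketched like this. This is a hypothetical sketch using SciPy, not the repository's code: it assumes the model outputs a 3×3 rotation matrix and that ground-truth poses are Euler angles in 'xyz' order; `decode_best` is a name I made up.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def decode_best(pred_matrix, gt_xyz_deg):
    """Decode a predicted rotation matrix with both the 'xyz' and 'zxy'
    Euler orders and keep the decoding whose mean absolute error against
    the ground-truth angles is smaller (hypothetical helper)."""
    rot = Rotation.from_matrix(pred_matrix)
    # Candidate Euler decompositions of the same rotation.
    candidates = {seq: rot.as_euler(seq, degrees=True) for seq in ("xyz", "zxy")}
    # Per-face error of each candidate against the ground truth.
    errors = {seq: np.mean(np.abs(ang - gt_xyz_deg)) for seq, ang in candidates.items()}
    best_seq = min(errors, key=errors.get)
    return best_seq, candidates[best_seq], errors[best_seq]


# Usage: a pose built from known 'xyz' angles decodes back with ~zero error.
gt = np.array([10.0, 20.0, 30.0])
R = Rotation.from_euler("xyz", gt, degrees=True).as_matrix()
seq, angles, err = decode_best(R, gt)
```

Comparing 'zxy' angles element-wise against 'xyz' ground truth is a simplification, but it matches the thread's idea of simply picking the better decoding for each face.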
Alright, thank you.
Happy new year! I have tried it. Even though I choose the minimum error between
So, fine-tuning plays an important role in pose evaluation.
Thanks, to you too! Have you also converted the GT to ...? Regardless, the fine-tuning tries to constrain the learned poses to less than 90 degrees; that's why it performs better when tested with Euler angles.
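The 90-degree issue mentioned here can be seen directly: once the middle angle of a Tait-Bryan decomposition leaves [-90°, 90°], the same rotation decodes to a different, wrapped set of Euler angles. A small illustration of my own, using SciPy (not taken from the repository):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# A pure 100-degree rotation about y, written in 'xyz' Euler angles.
R = Rotation.from_euler("xyz", [0.0, 100.0, 0.0], degrees=True)

# Decoding constrains the middle angle to [-90, 90], so the identical
# rotation comes back as roughly (+/-180, 80, +/-180), not (0, 100, 0).
decoded = R.as_euler("xyz", degrees=True)
```

So a model whose training poses stay under 90 degrees (as the fine-tuning enforces) sidesteps this wrap-around, while an unconstrained model can be penalized in an Euler-angle metric for rotations that are actually correct.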
OK, I retested the model without fine-tuning; the results are as follows:
I still consider this a big gap. 🐰
If what you care about most is pose evaluation in Euler angles, then sure, use the fine-tuned model.
As your paper states, for face pose evaluation you fine-tune the model on the 300W-LP dataset. However, I cannot find the corresponding code in your repository; did I miss it?