
code for shape-consistency loss #22

Closed
xianzhongma opened this issue Jan 19, 2021 · 3 comments

Comments

@xianzhongma

I understand the idea behind the shape-consistency loss and think it is reasonable, but I am confused about how to write the code. According to the paper, the shape-consistency loss is computed by exchanging the shape code (beta) between two examples of the same identity. However, the batch size is usually greater than 2, so how should a whole batch of examples be handled? Any suggestions or comments are welcome. Thanks.
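
To make the question concrete, here is a rough PyTorch sketch of how I imagine the batching could work (just a sketch of the idea, not the actual implementation; `decode_and_render`, `photometric_loss`, and `landmark_loss` are placeholders for whatever the real pipeline uses): build each batch from identity pairs, so that samples 2k and 2k+1 show the same person, then swap the predicted shape codes within each pair and re-apply the usual reconstruction losses.

```python
import torch

def shape_consistency_loss(shape_codes, other_codes, images, landmarks_gt,
                           decode_and_render, photometric_loss, landmark_loss):
    """Swap shape codes within identity pairs and re-apply the reconstruction losses.

    Assumes the batch is built so that samples 2k and 2k+1 are two images of
    the same identity. All callables are placeholders, not names from this repo.
    """
    B = shape_codes.shape[0]
    assert B % 2 == 0, "batch must consist of identity pairs"

    # Index [1, 0, 3, 2, ...] that exchanges the two members of each pair.
    swap_idx = torch.arange(B).view(-1, 2).flip(1).reshape(-1)
    swapped_shape = shape_codes[swap_idx]

    # Decode/render with the swapped shape code but the original per-image
    # codes (expression, pose, camera, albedo, lighting, ...).
    rendered, landmarks_pred = decode_and_render(swapped_shape, other_codes)

    # Since both images in a pair show the same person, the reconstruction
    # with the swapped shape code should still match the original image.
    return photometric_loss(rendered, images) + landmark_loss(landmarks_pred, landmarks_gt)
```

With this pairing the swap works for any even batch size, but I am not sure whether this is how it is meant to be done in the paper.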

@TimoBolkart
Collaborator

The training code is not available, so the implementation details cannot be found in the released code.

@hillaric

Can you give more detail about the consistency loss? In trainer.py (L 125), I cannot understand how the loss is calculated.

@bxiong97

> The training code is not available

Do you mean it is now available?
