
Question About mv-loss #8

Closed
ky941122 opened this issue Dec 15, 2020 · 1 comment
Comments

@ky941122

Hi, I have recently read your MLCR paper and am very interested in your ideas, but I'm a little confused about the mv-loss.
Why do you constrain the classifiers' weights of the different views to be mutually orthogonal, rather than simply constraining the features of the different views?

@nxsEdson
Owner

Sorry for the late reply, and thank you for your interest in our work. There are two reasons for using the weights instead of the features: 1) the features are shared across the multi-label classification tasks, e.g., the 12 AU classification tasks, so a per-AU orthogonality constraint cannot be applied to them directly; and 2) the weight vector for each AU can be regarded as a representation of the features for that specific AU, because it acts as the basis vector onto which the features are projected, and the projection result is the final prediction.
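
To illustrate the idea, here is a minimal sketch of what such a weight-orthogonality constraint between two views' classifiers could look like in PyTorch. The function name `mv_loss`, the tensor shapes, and the squared-cosine penalty are illustrative assumptions, not the exact formulation in the paper or in this repository.

```python
import torch
import torch.nn.functional as F

def mv_loss(w_view_a, w_view_b):
    """Sketch of a weight-orthogonality multi-view loss (illustrative only).

    w_view_a, w_view_b: classifier weight matrices of shape (num_aus, feat_dim),
    one row per AU, one matrix per view. The penalty is the mean squared cosine
    similarity between the two views' weight vectors for the same AU, which
    pushes the corresponding weight vectors towards mutual orthogonality.
    """
    wa = F.normalize(w_view_a, dim=1)   # unit-norm rows
    wb = F.normalize(w_view_b, dim=1)
    cos = (wa * wb).sum(dim=1)          # per-AU cosine similarity
    return (cos ** 2).mean()

# Example: two view-specific linear classifiers over a shared 512-d feature,
# each predicting 12 AUs (shapes are illustrative, not the paper's exact setup).
clf_a = torch.nn.Linear(512, 12, bias=False)
clf_b = torch.nn.Linear(512, 12, bias=False)
loss = mv_loss(clf_a.weight, clf_b.weight)
```

Because the constraint acts on the weight rows rather than on the shared features, each AU gets its own orthogonality term between views while the shared feature extractor is left free to serve all AUs.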
