This repository has been archived by the owner on May 28, 2024. It is now read-only.

Hello, can I use this for multi-label classification? If so, what should I pay attention to during label prediction? For multi-label classification, a per-class sigmoid with a binary cross-entropy loss is generally used; can the loss function be changed accordingly? #39

Closed
ghost opened this issue Dec 15, 2020 · 3 comments

Comments

@ghost

ghost commented Dec 15, 2020

Hello, can I use this for multi-label classification? If so, what should I pay attention to during label prediction? For multi-label classification, a per-class sigmoid with a binary cross-entropy loss is generally used; can the loss function be changed accordingly?

@david-berthelot
Contributor

We didn't try MixMatch in this setting. One could treat a multi-label classification loss as multiple binary (per-class) losses to model the problem in a way that is compatible with MixMatch. Beyond that, I can't really give more advice other than experimenting with the code on your data.
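
As a rough illustration, here is a minimal sketch of what "multiple binary losses" could look like, assuming a TensorFlow setup like this repo's. The function names, the per-class L2 consistency term, and the independent sharpening rule are assumptions for illustration, not code from the repository:

```python
import tensorflow as tf

def multilabel_losses(logits_x, labels_x, logits_u, guessed_u):
    # Supervised term: per-class sigmoid cross-entropy (binary cross-entropy)
    # over multi-hot targets, instead of softmax cross-entropy over one-hot targets.
    loss_x = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(labels=labels_x, logits=logits_x))
    # Unsupervised term: keep an L2 consistency loss as in MixMatch, but compare
    # per-class sigmoid probabilities against the guessed multi-hot targets.
    loss_u = tf.reduce_mean(tf.square(guessed_u - tf.sigmoid(logits_u)))
    return loss_x, loss_u

def sharpen_multilabel(p, T=0.5):
    # Per-class sharpening: apply the temperature to each Bernoulli probability
    # independently rather than renormalizing across classes as in the paper.
    p_t = tf.pow(p, 1.0 / T)
    return p_t / (p_t + tf.pow(1.0 - p, 1.0 / T))
```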

@ghost
Author

ghost commented Dec 18, 2020

Thank you for your reply. I trained MixMatch on my own multi-class dataset (not multi-label). The first few epochs of training worked well, but after that the loss began to rise and the precision decreased, even after adjusting many parameters. Could this be due to label prediction errors on the unlabeled samples, which made the model train worse?
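
One way to test that hypothesis is to track the accuracy of the guessed labels over training. A minimal sketch, assuming a small labeled subset can be routed through the unlabeled branch for diagnosis (the helper name and setup are illustrative, not part of the repo):

```python
import numpy as np

def pseudo_label_accuracy(avg_probs, true_labels):
    # Compare the guessed (argmax) labels on a held-out labeled subset that was
    # fed through the unlabeled branch against its true labels. A drop in this
    # accuracy over epochs would support the pseudo-label-error explanation.
    guessed = np.argmax(avg_probs, axis=1)  # single-label (multi-class) case
    return float(np.mean(guessed == np.asarray(true_labels)))
```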

@david-berthelot
Contributor

It could be a lot of things; it's hard for me to tell. This repository is mostly meant to show how to reproduce the results reported in the paper.

@carlini carlini closed this as completed Jun 23, 2021