This repository has been archived by the owner on May 28, 2024. It is now read-only.
Hello, can I use this for multi-label classification? If so, what should I pay attention to during label prediction? For multi-label classification, a sigmoid output with a binary cross-entropy loss is generally used. In that case, can the loss function be changed to use sigmoid?
#39
We didn't try MixMatch in this setting. One could view a multi-label classification loss as a set of independent binary losses, which would model the problem in a way compatible with MixMatch. Beyond that, I can't really give more advice than experimenting with the code on your data.
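A minimal sketch of that idea: treat each label as an independent Bernoulli variable, score predictions with per-label sigmoid binary cross-entropy, and sharpen each label's probability independently (an analogue of MixMatch's softmax sharpening). The function names and the per-label sharpening rule below are my own illustration, not part of the MixMatch codebase:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def multilabel_bce(logits, targets):
    """Mean binary cross-entropy over all labels, treating each label
    as an independent binary classification problem."""
    p = np.clip(sigmoid(logits), 1e-7, 1 - 1e-7)
    return -np.mean(targets * np.log(p) + (1 - targets) * np.log(1 - p))

def sharpen_multilabel(p, T=0.5):
    """Per-label analogue of MixMatch sharpening: raise each Bernoulli
    probability and its complement to 1/T, then renormalize per label."""
    num = p ** (1.0 / T)
    return num / (num + (1.0 - p) ** (1.0 / T))
```

With `T < 1`, probabilities above 0.5 are pushed toward 1 and those below 0.5 toward 0, which plays the same role as sharpening the softmax guess on unlabeled examples; whether this works well in practice would need to be verified experimentally.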
Thank you for your reply. I trained MixMatch on my own multi-class dataset (not multi-label classification). The first few epochs of training went well, but after that the loss began to rise and the precision dropped, even after tuning many hyperparameters. Could this be caused by incorrect label guesses on the unlabeled samples, which then degrade the model's training?