
About two different losses during training #62

Open
DISAPPEARED13 opened this issue Aug 30, 2022 · 1 comment

Comments

@DISAPPEARED13

Hi there, I downloaded this code and adapted it for my semi-supervised segmentation task (PyTorch version). Thanks for the great code you provided!

But I have a question. I know that the mean-teacher model involves two losses: a supervised loss between the predictions on labeled data and the ground truth, and a contrast (consistency) loss.

Here is how I compute the contrast loss:

  1. Get the consistency weight as 10 * sigmoid_rampup(epoch, 5) at each epoch.
  2. Compute the loss from the logits of the student's and teacher's outputs (see the sketch below).

The contrast loss I get rises into the thousands, which does not seem right. Is this normal, or is there a bug in my code?
Could you give me some advice if you have any ideas? Thanks!
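A minimal sketch of the two steps above, assuming the ramp-up schedule and element-averaged MSE used in common mean-teacher implementations (the exact functions in this repository may differ):

```python
import math
import torch.nn.functional as F

def sigmoid_rampup(current_epoch, rampup_length):
    # Exponential ramp-up from 0 to 1 over `rampup_length` epochs
    # (the schedule typically used in mean-teacher implementations).
    if rampup_length == 0:
        return 1.0
    current = max(0.0, min(float(current_epoch), float(rampup_length)))
    phase = 1.0 - current / rampup_length
    return math.exp(-5.0 * phase * phase)

def consistency_loss(student_logits, teacher_logits):
    # Mean-squared error between the softmax outputs of student and teacher,
    # averaged over every element (batch, classes, and pixels).
    student_prob = F.softmax(student_logits, dim=1)
    teacher_prob = F.softmax(teacher_logits, dim=1)
    return F.mse_loss(student_prob, teacher_prob)

# Usage per training step (shapes are hypothetical, e.g. (B, C, H, W) for segmentation):
#   weight = 10 * sigmoid_rampup(epoch, 5)
#   loss = supervised_loss + weight * consistency_loss(student_logits, teacher_logits)
```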

@DISAPPEARED13
Author

Hi there, an update on my problem. After some debugging I found that the problem comes from softmax_mse_loss; after changing it to nn.MSELoss() the loss no longer blows up into the thousands.

But one thing still confuses me:

since the contrast loss acts as a regularizer between the student and teacher models, should it neither drop very low nor grow very large, just oscillating around 1e-4 or so?

Thanks.
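For reference, a hedged explanation of the difference observed above: many mean-teacher codebases define softmax_mse_loss with a sum reduction divided only by the number of classes, so for dense segmentation outputs the value scales with the batch size and the number of pixels and can easily reach the thousands, whereas nn.MSELoss() averages over all elements. A sketch of that sum-reduced variant, assuming this is the definition used here:

```python
import torch.nn.functional as F

def softmax_mse_loss_sum(student_logits, teacher_logits):
    # Variant found in many mean-teacher codebases: sum of squared errors
    # divided only by the number of classes.  For a segmentation batch of
    # shape (B, C, H, W) the result scales with B * H * W, which is why
    # the raw value can reach the thousands.
    assert student_logits.size() == teacher_logits.size()
    num_classes = student_logits.size(1)
    student_prob = F.softmax(student_logits, dim=1)
    teacher_prob = F.softmax(teacher_logits, dim=1)
    return F.mse_loss(student_prob, teacher_prob, reduction='sum') / num_classes

# nn.MSELoss() (reduction='mean') divides by every element instead, so the
# same predictions give a value smaller by a factor of B * H * W.
```

With either normalization, it is common for the consistency term to stay small and roughly plateau rather than decrease monotonically, since it only regularizes agreement between the student and teacher predictions.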
