The adv_loss curve is strange. #39
Comments
Hi, it seems that your discriminator has largely converged. Usually, we would expect loss_D to be around 0.2-0.5 when adversarial training is working properly. What kind of data are you using?
The D_loss is way too low. There must be some asymmetric statistics between your GT and pred, so that D can easily tell them apart. The other possibility is that the adversarial loss is not being trained properly. Btw, when D_loss is low, the adv_loss should be pretty high, so this part is a bit weird.
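A minimal numeric sketch of the point above (purely illustrative, not this repo's code): under a binary cross-entropy formulation, when the discriminator confidently scores predictions as fake, its own loss is small while the generator's adversarial loss on those same scores is large, so the two losses should not both be low at once. The `bce` helper and probabilities below are assumptions for illustration.

```python
import numpy as np

def bce(p, target):
    # Binary cross-entropy for a scalar probability p against a 0/1 target.
    eps = 1e-7
    p = np.clip(p, eps, 1 - eps)
    return float(-(target * np.log(p) + (1 - target) * np.log(1 - p)))

# Suppose D scores a generated mask as fake with high confidence (p ~ 0).
p_fake = 0.02
d_loss = bce(p_fake, 0.0)    # D wants fake -> 0: loss is tiny (~0.02)
adv_loss = bce(p_fake, 1.0)  # G wants fake -> 1: loss is large (~3.9)
print(d_loss, adv_loss)
```

So if both losses are near zero simultaneously, something in the loss wiring (e.g. targets or detach placement) is likely off.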
Thank you! My GT and pred are the same types as in the usual segmentation task; the labels are in one-hot format, obtained from a BxHxW label tensor with 4 classes. I'll check my code again. What is strange is that although the loss was abnormal in the first experiment (as the first figure above shows), it finally got a better result than the experiment without adversarial training, while the second one got a much worse result than the experiments without adv.
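For reference, the one-hot conversion described above can be sketched as follows. This is an illustrative numpy version under assumed shapes (B, H, W) -> (B, C, H, W) with C=4, not the code actually used in this issue.

```python
import numpy as np

def to_one_hot(labels, num_classes=4):
    # labels: (B, H, W) integer array with class ids in [0, num_classes)
    b, h, w = labels.shape
    one_hot = np.zeros((b, num_classes, h, w), dtype=np.float32)
    for c in range(num_classes):
        # channel c is 1 wherever the pixel's class id equals c
        one_hot[:, c, :, :] = (labels == c)
    return one_hot

labels = np.array([[[0, 1], [2, 3]]])  # B=1, H=2, W=2
oh = to_one_hot(labels)
print(oh.shape)  # (1, 4, 2, 2)
```

Each pixel contributes exactly one 1 across the channel dimension, so `oh.sum(axis=1)` is all ones; if that invariant fails on the real data, D could exploit the asymmetry mentioned earlier.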
I see. I don't have any new suggestions other than looking into why adv_loss and D_loss are both low when they should be competing against each other.
Thanks for your quick reply!
Hi, sorry for not following up in time. I'm closing this issue for now. Feel free to shoot me an email if you have any further questions about this work.
Hi! First, I'd like to thank you for the very helpful repo. While training with the same discriminator as yours, I have run into some problems that I can't figure out. 1. The discriminator losses on pred and GT are almost unchanged for most of the time, so I am wondering whether this phenomenon is normal. Btw, I haven't added the semi-supervised data yet.