Could You Please Share the Curve of Training Loss? #20
Comments
Thanks for your kind response.
Hi, thanks for sharing the code.
I ran into the same issue. My guess is that the features stored in the memory bank come from the previous epoch, which leads to a high loss when the new epoch's features come in. But if you average the loss over each epoch, it still decays.
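To make the staleness concrete, here is a minimal sketch of a momentum-updated memory bank (this is not the repo's actual NCEAverage code; the class name and default momentum are illustrative). Each slot is only refreshed when its sample is revisited, so at the start of an epoch the bank still holds features produced by the previous epoch's weaker encoder:

```python
import torch
import torch.nn.functional as F

class MemoryBank:
    """Toy momentum memory bank: slot i is refreshed only when sample i
    is seen, so early in an epoch the bank still holds last epoch's features."""

    def __init__(self, n_samples, feat_dim, momentum=0.5):
        # Random unit-norm initialization, as is common for NCE memory banks.
        self.momentum = momentum
        self.bank = F.normalize(torch.randn(n_samples, feat_dim), dim=1)

    def update(self, indices, features):
        # Blend the stale entries with fresh encoder outputs;
        # `momentum` here plays the role of nce_m in this thread.
        mixed = self.momentum * self.bank[indices] + (1 - self.momentum) * features
        self.bank[indices] = F.normalize(mixed, dim=1)
```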
I ran into the same issue as well; it has also been discussed in #27. Yes, the average loss still decays, but I find this may hurt the performance of CMC on small datasets.
I have the same experience.
Yes, I also find that decreasing nce_k can improve performance. Could you please share some experience on tuning nce_m and the learning rate?
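For reference, a rough sketch of the kind of NCE scoring that nce_k controls (assuming negatives are drawn uniformly from the memory bank; the function name and signature are made up for illustration). Since both the positive and the negatives are read from the bank, a large nce_k pulls many stale features into every batch, which may be part of why a smaller nce_k helps on small datasets:

```python
import torch
import torch.nn.functional as F

def nce_loss(query, bank, pos_idx, nce_k=4096, temperature=0.07):
    # query: (B, D) features from the current encoder.
    # bank:  (N, D) memory bank; both positives and negatives come from it,
    #        so they may be up to one epoch stale relative to `query`.
    B, N = query.size(0), bank.size(0)
    neg_idx = torch.randint(0, N, (B, nce_k))                      # nce_k negatives per sample
    pos = (query * bank[pos_idx]).sum(dim=1, keepdim=True)         # (B, 1) positive scores
    neg = torch.bmm(bank[neg_idx], query.unsqueeze(2)).squeeze(2)  # (B, nce_k) negative scores
    logits = torch.cat([pos, neg], dim=1) / temperature
    # The positive sits at index 0, so a zero target gives the InfoNCE-style loss.
    return F.cross_entropy(logits, torch.zeros(B, dtype=torch.long))
```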
Hi,
I want to use CMC in my own experiment, but the loss behaves strangely. Within each epoch, the loss decays as expected (e.g., from 20 to 11). But at the start of the next epoch, the loss jumps back to nearly its initial value (around 20 again). I wonder whether this is 'normal' for CMC.
Thanks.