
Training loss suddenly decreases at the beginning of each epoch. #14

Closed

Tebmer opened this issue Jun 22, 2022 · 4 comments

Comments

@Tebmer

Tebmer commented Jun 22, 2022

Hi,

I use the code in the codes directory to train the model, but I've noticed a strange phenomenon: the loss suddenly decreases at the beginning of every epoch.

So my questions are: does this match your training curve, and why does the loss suddenly decrease at the start of every epoch? I've thought about it for a long time but still can't figure it out. Thanks a lot!

[screenshot: training loss curve]

@lsy641
Collaborator

lsy641 commented Jun 26, 2022

Which version of the code do you use?

@Tebmer
Author

Tebmer commented Jun 27, 2022

The first one, the codes directory.

@Tebmer
Author

Tebmer commented Jul 4, 2022

@lsy641 Hi, I'm using the first version of the code, i.e. the codes directory.

@lsy641
Collaborator

lsy641 commented Dec 28, 2022

Sorry, I didn't see the newest comment. I didn't notice a strange curve during training. By the way, I fixed a bug on June 25. @Tebmer

@lsy641 lsy641 closed this as completed Dec 28, 2022
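The thread leaves the cause unexplained. One common, purely hypothetical explanation (not confirmed by the maintainers here) is that the plotted value is a running average of the loss that resets at each epoch boundary: even if the raw loss decreases smoothly, the freshly reset average no longer includes the higher losses from earlier in the previous epoch, so the curve appears to "drop" at every epoch start. A minimal sketch of that logging artifact:

```python
# Hypothetical illustration: log a per-epoch running mean of a smoothly
# decreasing raw loss and observe an apparent drop at each epoch boundary.

steps_per_epoch = 100
num_epochs = 3

# Smoothly, monotonically decreasing raw loss across all training steps.
raw_loss = [2.0 * 0.999 ** t for t in range(steps_per_epoch * num_epochs)]

logged = []  # the curve that would actually be plotted
for epoch in range(num_epochs):
    total = 0.0
    for i in range(steps_per_epoch):
        total += raw_loss[epoch * steps_per_epoch + i]
        logged.append(total / (i + 1))  # running mean, reset every epoch

# The first logged point of epoch 2 is lower than the last point of epoch 1:
# the restarted average starts from the current (lower) raw loss, while the
# old average still carried the higher losses from the start of epoch 1.
drop = logged[steps_per_epoch - 1] - logged[steps_per_epoch]
print(f"apparent drop at epoch boundary: {drop:.4f}")
```

If the curve in the screenshot was produced this way, plotting the raw per-step loss (or a fixed-window moving average that does not reset) would remove the discontinuity.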